Gorilla

Established Members
  • Posts: 26
  • Joined
  • Last visited
  • Days Won: 2

Everything posted by Gorilla

  1. Good point. I forgot about stand-alone WSUS after spending so much time with SUP. I'll use that jargon in the future.
  2. Thanks a ton, Peter. It appeared that was how it functioned. I will make sure to use the "ignore maintenance window" option for any deployments that are offered instead of mandatory, since the application deployment management strategy I'm using really only has Offered and Mandatory advertisements. I assume maintenance windows affect WSUS updates too, but those only get pushed out at night, and it would be an improvement to prevent missed ones from running during business hours. I had been "dialing in" slowly and measuring impact. It seems that even judiciously used and properly configured MWs aren't an option here. Thanks again!
  3. I am trying to perfect my managed-application strategy. Using this scenario, please advise. Collection A has 50 computers in it and no maintenance windows. Advertisement 1 is created, targets Collection A, and is mandatory so WoL can be used to turn on the computers during off-hours. One month later, Computer 51 must receive this required application after hours using WoL and is added to Collection A. Question: in this situation, is the month-old mandatory assignment seen as passed, so Computer 51 will begin installing as soon as it receives the job and package? Or does the mandatory assignment only apply to computers in the collection at the time it triggers and then just stay dormant? I could address this many ways depending on the behavior. Assuming it doesn't just trigger and die, and does apply to computers added to the collection later, the options I see are: 1) add maintenance windows to managed-application collections so installs run only after hours, or 2) manually delete the mandatory assignment (not the advertisement) after it runs, add a new one whenever a new client is added, and set re-run behavior to re-run only if failed so it doesn't run again on the current computers. Can anyone validate this strategy and my comprehension of how this will function? Am I missing any attractive options? My preference would be #1, since it's a 'set it and forget it' solution and I'm ambitiously lazy. #2 is just for kicks and to make sure I understand the nuances of managing mandatory assignments and reducing impact to production systems during production hours. I could test this myself, but IT is already a 'spend-a-lot-of-time-focusing-alone' job. Sometimes I like to hop on here and get opinions if I can't find an answer. And since I couldn't find one, hopefully it will benefit posterity too. Thanks everyone!
  4. I have an application that points at a license server. The server name is in an HKLM\Software\<name>\License\<dword value> registry key, and I need to update the value. I've done lots of login scripts that import registry settings and wanted to use SCCM so I'd have a tested way to execute a Reg command on clients and get away from the login-script model. These are all 32-bit XP clients. So basically I have a DWORD value of <servername> in a .reg file, with the command regedit /s <file>.reg (and I've tried Reg /import as suggested). My problem, it turns out, is executing the command while the .reg file sits on a network share. I have tried both a UNC path and a mapped drive letter, and it fails. If I place the file on a local drive, it works. I had hoped SCCM abstracted that away when I built a package, but I'm guessing that's my problem. So unless I learn otherwise, I'll just put the logic in a script (copy file locally > execute file > check value > delete file), and at that point I don't even need SCCM. Though if it's possible, I'd like to understand the best-practice methodology. I never run out of situations where a registry value needs to be added, deleted, or modified on a host of clients.
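     If I do end up going the script route, this is roughly the batch sketch I have in mind (untested; the share path, file name, key, and value name are hypothetical stand-ins for my real ones):

        @echo off
        rem Copy the .reg file locally, import it, verify the value landed, then clean up.
        copy /y "\\fileserver\deploy\licenseserver.reg" "%TEMP%\licenseserver.reg"
        regedit /s "%TEMP%\licenseserver.reg"
        rem reg query exits non-zero if the value is missing
        reg query "HKLM\Software\SomeVendor\License" /v LicenseServer >nul 2>&1
        if errorlevel 1 echo WARNING: value not found after import
        del "%TEMP%\licenseserver.reg"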
  5. Thanks for the response. You are absolutely right that I both forget how good the help is in SCCM (not used to that!) and that I didn't use it. I needed an excuse to visit your site again anyway. :>) If it's any consolation, the help didn't help me in this case. Just got back to checking the status of my task. It reported as succeeded, but when I checked, the registry value was not updated. Which is strange, because it did work in my test with one machine; all I changed was the collection. No, I do not have the file locally, and I hope I won't have to, as that would be hard to do on all clients for advertised tasks. Instead I created a package on a network share that is accessible to the install base. And it did work when I tested it, so now I'm thinking maybe these are one-hit wonders and I needed to re-advertise it when I changed the collection and schedule. I'll need to mull this over tomorrow, so if you see any obvious misconceptions on my part, I'd appreciate having them pointed out. Cheers! -Kelly
  6. I'm using a task sequence to run only a command, for example: regedit /s <file>.reg. Setting one up for the first time, I created the task sequence, edited it, added a Run Command Line step, and placed the above in the command line field. Then I placed the .reg file on a network share and put that path in the "Start In" field. Finally, I advertised it to a collection with one workstation, set it to run as soon as possible, and made it mandatory with WoL. It didn't work until I abandoned the "Start In" field and created a package out of the reg file. So what is the purpose of the "Start In" field if it isn't the working directory? I didn't change the location of the file; I only made a package out of it and cleared the "Start In" field. So while I've figured out a way to do it, every reference I've found to the "Start In" field in task sequences touts it as the working directory. Looking for any clarity on how task sequences used for command-line tasks should be leveraged, especially if there's a file involved. Thanks a ton!
  7. Okay, my trials and tribulations shall shine for posterity. It worked! First time, so I wasn't sure what to look for. Here's what a successful statesys.log summarization looks like (continuing the log excerpt from my earlier post):
        SQL MESSAGE: spTask_SUM_UpdateStatusSummarizer - 11:05:01:057: summarizing status for CI 58630 SMS_STATE_SYSTEM
        SQL MESSAGE: spTask_SUM_UpdateStatusSummarizer - 11:05:01:057: summarizing status for CI 58632 SMS_STATE_SYSTEM
        SQL MESSAGE: spTask_SUM_UpdateStatusSummarizer - 11:05:01:073: summarizing status for CI 58641 SMS_STATE_SYSTEM
        SQL MESSAGE: spTask_SUM_UpdateStatusSummarizer - 11:05:01:073: summarizing status for CI 58648 SMS_STATE_SYSTEM
        SQL MESSAGE: spTask_SUM_UpdateStatusSummarizer - 11:05:01:073: summarizing status for CI 58657 SMS_STATE_SYSTEM
        SQL MESSAGE: spTask_SUM_UpdateStatusSummarizer - 11:05:26:730: spTask_SUM_UpdateStatusSummarizer done SMS_STATE_SYSTEM
        Task 'SUM Update Status Summarizer' completed successfully after running for 735 seconds, with status 20703. SMS_STATE_SYSTEM
     It took roughly 12 minutes to complete. Upon refreshing: voila! I'm going to chug some coffee and do a jig. Oh, then I'm going to secure my clients. :>)
  8. Bah, no joy. Statesys.log reported:
        Found new state messages to process, starting processing thread SMS_STATE_SYSTEM 8/11/2010 11:00:30 AM 7916 (0x1EEC)
        Thread "State Message Processing Thread #0" id:6668 started SMS_STATE_SYSTEM 8/11/2010 11:00:30 AM 6668 (0x1A0C)
        Thread "State Message Processing Thread #0" id:6668 terminated normally SMS_STATE_SYSTEM 8/11/2010 11:00:30 AM 6668 (0x1A0C)
        CThreadMgr::ThreadTerminating - All threads have stopped running SMS_STATE_SYSTEM 8/11/2010 11:00:30 AM 6668 (0x1A0C)
     I then refreshed the home page summarization, and all updates since MS10-046 still show as Unknown. I'm relying on this article a bit: http://technet.microsoft.com/en-us/library/bb632932.aspx Any report that relies on Update Lists can't help me, since I use the Search Folder, which seems predicated on the Home Page summarization, to decide what to add to the update list. So if I can't rely on reports that need accurate update lists, and I can't tickle the Summary page into being accurate, I am at a loss for a graceful method. For now I'm going to create an ad hoc update list out of the new updates purely to run a report and see what I need to approve. This is a lot more work than I used to do with WSUS alone, and while I see a lot of long-term benefit to the SCCM way, this kind of funkiness is disarming. So I'm open to ideas.
  9. Hold steady... I just found the Home Page Summarization feature. Waiting for statesys.log to report that it's complete; then hopefully a refresh and an update to my process is all that's needed! Will report back, hopefully soon. How long can this take? LOL
  10. I rely (should I - is there a better way?) on the Search Folders' Installed / Required / Not Required columns to add updates to my monthly security Update List, and I am failing to comprehend when and how those columns are updated. I just took one client, ran a Software Updates scan, watched it finish in the logs, then ran the Compliance 6 - Specific Computer report and saw that it requires some of the new updates. I did the same for my SCCM server, with the same results. However, when I am in the Search Folder and want to add only the updates required by a client to the appropriate update list, all the new ones are listed under the Unknown column. So here are my questions:
     1) How do I get the Search Folder pane to update the compliance columns? What triggers that normally?
     2) Is there a silver bullet for figuring out which new updates to add to your update lists if these columns aren't populated?
     3) Is there a way to remotely trigger an entire collection to scan for software compliance? I can do it per DDR using the Client Center right-click functionality, but the update scan isn't available on a collection.
     Based on the reports I'm running, it does seem I can find out, in a very convoluted, round-about way, which updates are required by my different architectures. I find the search folder columns invaluable for deciding which updates to add to which update lists each month, and I would have thought they look at the same tables and data the reports are looking at, but apparently I'm still missing some of the magic. Since the reports know, I believe my compliance scans are happening. So my guess is this is really just Question #1, and the other two can be ignored if there's a way for me to refresh those columns with the data the DB obviously has. I'm reluctant to add the Required column to my search folders until I see and understand when it gets populated in relation to syncs and scans; I created one to test, and even though SCCM scanned and knows a client needs some updates, they aren't listed as required in the Search Folder. Thanks for any and all assistance. -Kelly
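      For #3, the closest thing I've come up with so far is scripting the client-side trigger remotely rather than doing it from the console. A rough batch sketch (untested; computers.txt is a hypothetical one-name-per-line list exported from the collection, and the GUID is what I believe to be the Software Updates Scan Cycle schedule ID):

         @echo off
         rem Kick off the Software Updates Scan Cycle on each client in the list
         for /f "usebackq delims=" %%c in ("computers.txt") do (
             echo Triggering update scan on %%c
             wmic /node:"%%c" /namespace:\\root\ccm /interactive:off path sms_client call TriggerSchedule "{00000000-0000-0000-0000-000000000113}"
         )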
  11. Assuming these settings are part of configuring the SUP, I'll post this question here. When updates are configured not to automatically force a reboot (for example, on emergency dispatch systems), I get the package and red arrow in the system tray. Is there a way to force more visible reminders to reboot? I'd like something flashier and more pesky than the little box and arrow. I've been through all the settings, and all I find are reminders that a reboot is going to happen. When one is needed but not forced, can SCCM be configured to pester always-on systems until they reboot? Much thanks for any and all ideas. -Kelly
  12. YES! The obviously named ScanAgent.log is perfect. Unrelated to scanning, I also use WUAHandler.log, as it lets me know the status of updates being installed and such. I also verified that a scan does happen after a deployment. I am very interested in caching behavior, and the TechNet article handled that as well. I suspected, and am happy to learn, that the client verifies an update is still needed before downloading files to the cache. Does it only pre-cache for mandatory deployments? I have server admins who need to manually install updates, and I'd like to spare them having to wait for the download. Any methods for making an update available but not mandatory and having the files in the client cache ahead of time? Thanks Niall. This saved me some digging time. I've added this information to my knowledgebase.
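      For my own notes, the quick-and-dirty way I peek at these logs on a remote client from the site server (hypothetical machine name; the path assumes the default 32-bit ConfigMgr 2007 client install under %windir%\system32\ccm):

         rem Skim a remote client's update scan and WUA handler logs over the admin share
         type "\\CLIENT01\admin$\system32\CCM\Logs\ScanAgent.log" | more
         type "\\CLIENT01\admin$\system32\CCM\Logs\WUAHandler.log" | find /i "error"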
  13. Please correct any misunderstandings here: software update compliance scanning is done via the Software Updates Client Agent and can be invoked manually or automatically. Manual scans can be called from the client or remotely. Automatic scans, when they occur, and how you can monitor their progress are what I'd like to capture in this thread. The automatic scanning interval is set in the Software Updates Client Agent Properties on the General tab. I *believe* scans also happen immediately after a scheduled deployment finishes. Can someone corroborate whether a compliance scan occurs in relation to a deployment, and at what point, please? I rely on the Scan 1 - Last Scan States By Collection report. When it indicates a scan is currently running, is there a log file or method I could use on the client to monitor its progress? I want to figure out what's occurring during a scan and how long it takes a particular client to conduct one. Is there a better way to monitor scan states? My understanding is that scanning is done by the client but triggered or scheduled centrally, and involves reporting back to the SUP via IIS. So scanning issues might only be a reporting issue, which could be IIS and not indicative of a client failing to scan. Is there a file in the outbox (?) or somewhere on the client where one could verify a client's scan results if reporting is suspect? Do I have anything wrong there, and can anyone build on this? Thanks!
  14. It's alive. Alive I say. Go go gadget updates.

  15. There is no switch between the test server and client. Same subnet. And yes the port change worked. So somehow, Port 9 is broken on the server side. Strange days indeed. My spidey senses told me to try a port change hours ago, but my common sense told me that didn't make any sense. But as any good troubleshooter must do, I went back to basics and walked the tree. Alas the senseless prevails! Mr. Spock would never have cut it in this industry.
  16. Okay so oddly enough, in desperation, I changed SCCM WOL Unicast UDP packets from Port 9 to a dynamic port (45k+) and bam it fired! Go figure. I haven't the foggiest idea how just Port 9 could be on strike, but I have a workaround now that doesn't require Subnet broadcasts. If I figure out the Port 9 thing, because I'm damn curious, I'll post the solution. I still need to figure out why my DST patch isn't working. Currently padding deployments by adding an hour.
  17. Throwing one out there for those either successfully using unicast wake-up packets from SCCM or who have an idea about what I'm going through. SCCM 2007 R2 on Server 2008 SP2 (one MP/DP with a remote SQL 2005 DB) is NOT sending out unicast wake-up packets any longer. Subnet-directed works. I *had* unicast working and tested, but then there was a WSUS version-upgrade snafu that wiped all the WSUS bits. Indeed it wiped them better than the MS MSI uninstaller; not a trace. I was forced to reinstall the WSUS 3.0 SP2 bits again as stand-alone and reinstall my SUP. Okie doke, we're up and running again except for two bones of contention.
     1) The DST bug is back. My WOLMGR.DLL is the old version, and that's after several successful re-installations of the hotfix that previously fixed it. Not a big deal, since I can pad deployment times by an hour and that works until I get the new DLL to stick. But if anyone has seen this or has an idea, I'm chomping at the bit here.
     2) Unicast WOL packets haven't worked since the Great WSUS Crash of 2010. I mean they don't get sent out. No router between. One subnet. A sniffer shows the UDP WOL packet going out from the server when I manually send a WOL packet to the client MAC, so I know it's not a switch or ARP-table issue. I know the client wakes up and is compatible. I know a packet can be sent from the server. When configuring SCCM to send a wake-up packet via unicast, the log files (WOLMGR and WOLCMGR) all fire and show the packet being sent, but the sniffer just glares back at me balefully. I switch to subnet-directed, move the deployment schedule up by a few minutes, watch the logs fire, wait the 57 seconds or so that the log says, and then BAM! The sniffer shows the packet. Clients wake up. Checked DDRs and inventory. MACs are right. IPs are correct. DHCP shows the proper lease. Name resolution from the server works. No errors anywhere of any kind. No maintenance windows assigned. I'm out of bullets. WOL reports show that the packets are scheduled and seemingly sent; if not for the sniffer, I'd think they were being sent and the clients just weren't firing up for some reason.
     Anyone with any insight as to how one can tickle unicast for WOL (I did the radio-button tickle already) so that the server starts using it again might prevent me from stabbing myself in the left eye with a ballpoint pen. Since there are no separate server requirements I'm aware of for one type of WOL transmission over the other, subnet-directed wouldn't be working if I didn't have my site, collections, and deployments properly configured. To be clear, all I change is the WOL packet type in the Site Properties and it works. Unicast stopped working when I had to reinstall the WSUS 3.0 SP2 bits and reinstall the SUP. I'm not using Out of Band Management currently, so power-on commands aren't an option. Thanks for any and all assistance! I'll update with solutions as I find them.
  18. Argh, I'm a dunderhead! I'm still not completely accepting of how awesome MS has made the SCCM help files. I admit I was being lazy. Thanks! Makes perfect sense as usual.
  19. So it was bound to happen. Despite all the great information, I stumbled onto the button that threw my understanding into a tailspin. I'm going to play around with it today, but wanted to ask, both because I'm approaching a deadline and play time needs to be limited, and more for posterity. ;>) If we don't re-use deployment management tasks, and if they are created from a package, what's the deal with the ADD button on the Software Updates tab of a deployment task's properties page? I *thought* this all made sense and that I added files to the package and deployed the package. Now there's a button that lets me ADD updates to a task. Is anyone familiar with this button and why it exists? The Add Software Updates window it invokes looks like WSUS and not SCCM; there are no Search Folders. I want to try identifying an update I was going to add to my package, add it to the deployment instead, and see what happens. If that doesn't seem to do anything, I'll try updating the package with the same file, re-advertise, and see if it works. I'm just curious, and if anyone wants to talk me off the ledge, I do have better things to do. lol
  20. I'm using WOL-enabled updates and it works great. Given that you can be very granular with collections, the sky is the limit, and I recommend basing your design on the business outcome you want. For example, if you have a workgroup that shows up at 8:00 AM, setting a maintenance window from, say, 7:00 to 7:45 should see most updates installed before they arrive, leaving 15 minutes for reboots and early risers. You can push the start time earlier if you need longer windows, and a job won't start if there isn't time to complete the install within the maintenance window. Eventually there will be options for powering off clients, but that can be part of continual service improvement and is a great benefit to market later. Getting your SCCM collections, search folders, templates, and update lists dialed in so that the right groups cache downloads and exhibit the end-user experience (notifications and reboots) you're striving to deliver can take a bit of time and testing. I would recommend not waiting, because maintenance windows can minimize how long a computer is on after you deploy software. You can also consider making a task sequence that looks for logged-on users and issues a shutdown to the client if it isn't in use (rough sketch below). Hard to know, since it will be based on what your customer wants to occur. But there are enough tools; SCCM will let you deliver pretty much anything you can imagine. It just takes some creativity at times. :>)
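      As a rough illustration of that last idea, the command-line step could run something like this (untested sketch; the grace period and message are arbitrary):

         @echo off
         rem If nobody is logged on at the console, shut the machine down after the maintenance window.
         rem wmic returns a blank UserName when no console user is logged on.
         wmic computersystem get username | find "\" >nul
         if errorlevel 1 (
             shutdown -s -t 300 -c "Post-maintenance shutdown: no user logged on"
         ) else (
             echo User logged on - leaving the machine running.
         )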
  21. Sounds good. Appreciate the clarification; makes sense. You mentioned that as part of your process you update the deployment management task. However, I'm now going with the idea that I will delete my old deployment management task, since I'm creating a new one which includes all the previous updates (sans expired and superseded) from the old update list plus any new ones I just added. Are you keeping the "superseded" deployment tasks around for some reason, or deleting them when you approve the new one? If I first prune the update list of unneeded updates, why would I prune a deployment task I'm going to delete? That's the only part I'm confused about. So here's a quick example: UpdateList1 = 3 updates for XP clients. It is deployed successfully to all clients via the deployment (advertisement) task "XP Clients". Next month, 2 new updates come out. I add these updates to UpdateList1, then I add the new updates to the existing source package. Now when I use the current UpdateList1 to create the new advertisement, "XP Clients", it will include the updates it was already advertising plus the 2 new ones I added. I delete the old advertisement (no name conflicts). I create the new one. Voila! It sounds like you might be keeping the previous task(s) around?? That's where I'm confused. If so, why? Thanks for the quick and helpful response! You rock.
  22. Thanks in part to the great articles and support here, my SCCM-powered WSUS updates are flowing smooth like butter kept in the pantry. Many thanks! So the last principle I'm getting my head around is the actual deployment. I have search folders, templates, lists, deployment packages, shares configured, etc. Is there a way to reuse the same deployment? I ask because it seems it would be easier than having to delete old deployments. Do you have any insights or strategies for managing the deployments themselves? I would imagine that over time there would be many, and that at any time you can delete any deployment and within minutes have new deployments created based on "dialed-in" templates and source packages. But if I could just re-use them by adding the new updates and updating my distribution points... that would be even sexier.
  23. Great articles as always, Anyweb. So I was experimenting with what I like to call the "One Package"... one package to rule them all. Is it important that a package only contain updates applicable to the collection it is advertised to, or can it contain updates for other products too? For example, can a package have updates for Server 2003, 2008, XP, and Vista? I know it "can", but will it only apply those that the client needs? And if I'm deploying newer updates but use the same package again, will it matter that updates which were already installed are still in the package, i.e. will it try to reinstall previously installed updates? Last, can you talk a bit about managing the physical file/share level? I don't believe that deleting a superseded update physically removes the files. Should that be something we do manually each time we no longer need the binaries, or is there some way to clean that up automatically? What about re-using the share? Should all my downloads be in one folder via one share, and if so, what parses it: the update list or the deployment package itself? I think I've got most of it down now except for a bit of the philosophy and intent. I know it's flexible, but there are also limitations I'd like to understand better. I'll contribute more as I answer these things myself eventually, but any input and help with strategy is valued. Thanks!
  24. SCCM 2007 SP1, Server 2008, remote SQL 2005. Overnight, the admin console would stop responding on the local server; reboots resolved it until the next day. I noticed when using WBEMTEST that I was getting memory errors when it occurred. Increasing virtual memory made it take longer to fail. Suspecting a memory leak, I identified the PID of the WMI SVCHOST and saw it chomping away on half a gig. A bit of poking found this hotfix: http://support.microsoft.com/kb/958124 Problem solved! Posting here since I didn't see it mentioned in the guides. Not sure how others were avoiding this hole. I have green check marks everywhere and a happy console.
  25. For posterity, I wanted to contribute something wacky I've resolved with Microsoft's help. Server 2008, SCCM 2007 SP1, remote SQL 2005 DB. Textbook install on pristine hardware. Was getting errors and had several things working against me. This KB explained one: http://support.microsoft.com/kb/957879 However, the hotfix did NOT work: updated DLL, same problem. It turns out the Install.MAP file was not correct and an important bootstrap file wasn't being called. Once I updated Install.MAP and bounced the Site Component Manager, the sun came out, birds sang, and all was right in the universe. This was not intuitive, was not posted anywhere, and the support rep said the documentation was internal. Not sure how or why this one hit me, but in case anyone else runs into it, here's the detail: in the ConfigMgr root folder (C:\Program Files\Microsoft Configuration Manager) is a file called Install.MAP. Open it and find the SMS_SITE_SQL_BACKUP section under the BEGIN_COMPONENT_FILELIST section. There should be an entry:
        FILE <smssqlbkup.exe><1><766496>
     This tells SCCM the name of the executable, a bit flag for installed <1> or uninstalled <0>, and a six-digit code for the file. The remote SQL server needs the srvboot.exe file to, well... bootstrap the component remotely. But it wasn't in my component list. I added the following line after the previous one:
        FILE <srvboot.exe><0><219904>
     Bounce the Site Component Manager and enjoy life. Though maybe few others will see this, I find it hard to believe anyone else with this setup wouldn't hit the same problem. Microsoft did agree this was hotfix-level support and did not levy a support incident. Hope that helps some poor frustrated soul somewhere.