The Microsoft Server topic includes all of the legacy versions of the operating system, including Windows NT 3.1, NT 3.5, NT 4.0, Windows 2000, and Windows Home Server.


Issue: One Windows 2008 R2 64-bit server on the network is unable to connect to a Buffalo device (LinkStation) with firmware version 1.56. There are a total of four servers on the network, this being one of them.

Troubleshooting Steps:
  • Command line: net view \\servername | NOPE
    • System error 53 has occurred, the network path was not found
  • Command line: net view \\192.168.1.x | NOPE (using the actual IP address of the device)
  • Command line: net view /cache | Server is viewable but still get "System error 53"
  • Disabled Firewall | NOPE
    • Services Windows Firewall
  • Disabled Antivirus | NOPE
  • Tried to connect from another Windows 2008 R2 server, OK no issues.
  • Checked Network Settings, IP, DNS, WINS all the same (except the actual IP DUH!)
  • Ping both the NETBIOS name and IP address | OK
  • Clear and Reset: arp, nbtstat, ipconfig (from an elevated command prompt)
    • Elevated command prompt: arp -d
    • Elevated command prompt: nbtstat -R (must be capitalized)
    • Elevated command prompt: ipconfig /flushdns
    • Wait a few moments...
    • Elevated command prompt: nbtstat -RR (must be capitalized)
    • Elevated command prompt: ipconfig /registerdns
  • No errors in either Application log or the System log.
  • Ensured that SMB1 and SMB2 are installed, working, and enabled.
    • Powershell prompt: Set-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters" SMB1 -Type DWORD -Value 1 -Force
    • Powershell prompt:
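For completeness, the resulting registry values can also be inspected from PowerShell. A sketch only: if the SMB1/SMB2 values have never been set they will simply be absent, which means the protocols are enabled by default.

    Get-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters" |
        Select-Object SMB1, SMB2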

Know what services you can and cannot, should and should not combine on your server.
How to leverage one TLS certificate to encrypt Microsoft SQL traffic and Remote Desktop Services, versus creating multiple tickets for the same server.
This article shows how to maintain a simple logfile of all startup and shutdown events on Windows servers and desktops with PowerShell. The script can easily be adapted to do more, such as gracefully silencing or updating your monitoring system during planned maintenance reboots.
 
LVL 14

Author Comment

by:Raj-GT
Hi Steve,

Thank you for the comment. I am using the InstanceIDs that correspond to the EventIDs I mentioned in the article as it is somewhat faster to execute.
Get-EventLog -LogName System -Source User32 -Newest 10 | ? { $_.EventID -eq "1074"}



Get-EventLog -LogName System -Source User32 -Newest 10 -InstanceId 2147484722



Both commands above will produce the same result.
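For anyone wondering why those two numbers line up: the InstanceId is simply the classic EventID with status bits set in the high word, so the low 16 bits of 2147484722 (hex 0x80000432) give 1074. A quick check from a PowerShell prompt (illustrative only):

    '{0:X8}' -f 2147484722    # 80000432
    2147484722 -band 0xFFFF   # 1074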

Thanks,
Raj-GT
LVL 14

Author Comment

by:Raj-GT
An updated version of the scripts is now on GitHub at https://github.com/Raj-GT/Windows-Boot-Event-Logging
Since the demise of Windows XP, each version of the operating system (client and server) has its own unique way of using user profile information. This information applies to all Windows operating systems, not just the operating systems outlined in the sourced articles.
 
LVL 8

Expert Comment

by:Senior IT System Engineer
Thanks David.

I never knew about this until you shared this great article :-)

What's the impact of sharing or overwriting the user roaming profile home directory?
LVL 83

Author Comment

by:David Johnson, CD, MVP
"What's the impact of sharing or overwriting the user roaming profile home directory?" It is already a share, so I don't understand the question.
Understanding the various editions available is vital when you decide to purchase Windows Server 2012. You need to have a basic understanding of the features and limitations in each edition in order to make a well-informed decision that best suits your environment. In this article I will attempt to list some of the elements you need to consider before choosing which edition is most appropriate for your case.

In Windows Server 2012, there are four editions: Foundation, Essentials, Standard, and Datacenter. Don’t let the naming deceive you. While the Foundation and Datacenter editions map directly to the Foundation and Datacenter editions in Windows Server 2008, the Essentials and Standard editions do not. The Essentials edition in 2012 is equivalent to the Standard and Web editions in 2008, while the Standard edition in 2012 maps to the Enterprise edition in 2008. That being said, it should be clear which edition you are most likely to choose when upgrading from your Windows Server 2008 environment. Please note that all Windows Server 2012 editions are 64-bit; there is no longer any support for 32-bit operating systems.
 

Foundation Edition

This edition is a scaled down edition of Windows Server 2012. It is intended for small businesses requiring simple file and print services and is limited to 15 users only. It can only be installed on a physical machine; the license for this edition will not allow for a virtual machine install. …
LVL 83

Expert Comment

by:David Johnson, CD, MVP
People tend to forget that the virtual machine limitation only applies to the license of the Microsoft Server product. The limitation does not apply to other licensed products: you could have 1,000 Windows 7 Pro virtual machines (if you have 1,000 W7P licenses) or 1,000 Ubuntu virtual machines.
LVL 59

Expert Comment

by:Cliff Galiher
The description of Essentials is incorrect. Essentials is *not* analogous to standard in 2008, but is like essentials 2011. It is a unique product with special features (client backups) and unique limitations (*must* be a domain controller.)
Many of us in IT utilize a combination of roaming profiles and folder redirection to ensure user information carries over from one workstation to another; in my environment, it was to enable virtualization without needing a separate desktop for each user. It's also an effective way to make sure users who give no thought as to the proper place to save a document don't lose data, and to prevent data from being stolen or lost in the event a workstation dies or is stolen.

However, the biggest battle I have fought in maintaining our new "cloud" environment has been the Saved Games folder. I'm unsure which of the Microsoft security gurus decided this folder needed to be even more tightly protected than the .NET Framework folder, yet I constantly have users calling me with on-screen error messages saying they can't reach their AppData folder because the Saved Games folder has messed up the permissions inheritance in the user's profile.

When I check the permissions, all is fine: the appropriate accounts have the proper permissions, and the user is listed as the owner of the profile folder. But enforce ownership or inheritance on subcontainers, and more often than not the message "You do not have permission to read the contents of Saved Games. Do you want ... Full Control?" appears. Of course, answering Yes accomplishes nothing, as not even the server or domain administrator accounts can do anything with this folder without expressly taking ownership of Saved Games.

People frequently reach out to me to assist with outages in their Exchange environments. It's alarming how many Exchange engineers and administrators hold current Microsoft certifications yet continue to make the same mistakes over and over. Here are some tips you should follow when deploying an Exchange Mailbox server.

 

  • Place Exchange Database and Logs on Thick Volumes
Never use thin-provisioned disks with Exchange. When a thin-provisioned volume is added to a server, the server operating system has no insight into the storage array. Why does this matter? It matters because the storage array only allocates what is being used. As Exchange data is processed, the storage has to zero out each block and reserve it before the transaction can be written. This is a performance bottleneck for Exchange, especially in large environments. No high-I/O application should ever use thin provisioning. Always be specific about your storage needs if another team handles your storage requests. If you don't have access to your storage array, request a screenshot from them when they provision storage for you. Always validate Exchange builds with a catalog of validation documents.
  • Follow Best Practice
Always obtain best practice documentation from all your vendors.  Sometimes you will run into situations when you have contradictory best practice guidelines.  Most of the time you should use the vendor’s …

Expert Comment

by:Rik Tammegat
About storage... my pet peeve :)

I agree with your recommendation about full provisioning for log disks. These grow and shrink periodically, especially when using log backups. The storage array therefore has to allocate and deallocate constantly, which only adds overhead. I do, however, want to put up a side note for thin provisioning with the database disks. Some storage systems detect "zero-outs" on thin provisioned LUNs. The effect is that the writes are acknowledged to the Exchange servers, but won't get written to disk. Result is a faster "zero-out" than with full provisioning.

Bottom line: Know your storage box.
LVL 8

Expert Comment

by:Senior IT System Engineer
Thanks for sharing, RKluxen,

I couldn't agree more with the points you've shared above.

What about:

1. Setting the page file to a static size?
2. Disabling IPv6?
3. Skipping .NET Framework updates on all servers running Exchange?
So you have already heard all the reasons why self-signed certificates are not ideal, and you are ready to take the next step. You have added a Windows Enterprise Certificate Authority (CA) to your environment, and you are ready to start issuing certificates, but are now stuck. Really, I’m my own target audience for this article. In the past, I’ve found myself in need of certificates, and while creating templates on the CA is relatively easy, requesting those certificates and getting them in the right place has been harder. This article is the compilation of my notes from those past experiences, hopefully a little more readable and a lot more accessible.

There are good articles that describe how a PKI infrastructure works, and it would be a good idea to be familiar with the basics before we get started. In short, a certificate has a public part and a private part. Something encrypted with the private part is digitally signed and can be confirmed using the public part of the certificate. Something encrypted with the public part is considered encrypted, and can only be decrypted with the private part. We want to create a new certificate and get our CA to sign the certificate, indicating that if the CA is trusted, our new certificate should be trusted too.

When a certificate is needed (that is not automatically created and enrolled invisibly in the background), I usually create certificates on my Windows Servers with the certreq
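For reference, a minimal certreq workflow looks something like the sketch below; the subject name and template name are placeholders, not values from this article.

    ; request.inf -- minimal request definition (example values)
    [NewRequest]
    Subject = "CN=server01.example.com"
    KeyLength = 2048
    MachineKeySet = TRUE
    Exportable = FALSE

    [RequestAttributes]
    CertificateTemplate = WebServer

Running "certreq -new request.inf request.req" creates the request, "certreq -submit request.req server01.cer" sends it to the CA, and "certreq -accept server01.cer" installs the issued certificate into the machine store.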

Background Information


Recently I fixed file server permission issues for one of my clients. The client has 1,800 users and one Windows Server 2008 R2 domain-joined file server with 12 TB of data, 250+ shared folders, and a folder structure five levels deep. All shared folder access is granted on a per-user basis with no groups defined, causing the folder access control lists (ACLs) to become unmanageable.


The file server is part of one domain, and since the client has acquired another company, we have to grant the second company's users (in another domain) appropriate rights to the file server data. The domain-level trust is already in place.

The problem:


For many folders, administrators don't even have read access and can't check the folder ACL. They are unable to see the folder owner and unable to access the folder, and hence they cannot manage file server access.


For example:

Folder-Access-1.jpg Folder-Access-2.jpg
I went to the folder properties, and they show the folder as empty, when in reality the folder is not empty; I just don't have permission to view the folder size.

Folder-Access-3.jpg
I don't have access to view the folder's NTFS permissions, but I can view the share permissions, which are Full Control for Everyone.


I am even unable to see the folder owner:

Folder-Access-4.jpg
The administrator can forcefully take folder ownership with the replace-permissions option, but this destroys the existing file server permissions, which is not desirable.

Folder-Access-5.jpg
If I click Yes here, all existing permissions will be destroyed by granting me full control (in addition to ownership), which is not the objective. I am forced to click No, and I immediately get the following warning messages:

Folder-Access-6.jpg 

 

Folder-Access-7.jpg  

Unless I get folder ownership, I can't add or modify anybody, including myself, on the folder's access control list.

 

The root cause of this problem is that multiple users have Full Control NTFS permissions on the root folder. Some clever users have removed the built-in Administrators group from the access control list and from the Owner tab. The CREATOR OWNER group is listed on the ACL of the folders; because of it, the user who creates files and folders automatically becomes the owner of those files and folders. The permissions model has become complicated: access is granted to individual users instead of groups, which is difficult to track.


NTFS Folder ownership


  • Every object has an owner, whether the object is in an NTFS volume or in Active Directory Domain Services (AD DS). The owner controls how permissions are set on the object and to whom permissions are granted.
  • An administrator who needs to repair or change permissions on a file must begin by taking ownership of the file if he does not already have it.
  • By default, the owner is the entity that created the object. The owner can always change permissions on an object, even when the owner is denied all access to the object.

Ownership can be taken by:

  • A member of the Administrators group, which by default is granted the "Take ownership of files or other objects" user right.
  • Any user or group that has the Take Ownership permission on the object.
  • A user who has the "Restore files and directories" user right.

Ownership can be transferred in the following ways:

  • The current owner can grant the Take Ownership permission to another user. The user must actually take ownership to complete the transfer.
  • A member of the local Administrators group can take ownership.
  • A user who has the Restore files and directories user right can double-click Other users and groups and choose any user or group to assign ownership to.

 

CREATOR OWNER

Folder-Access-8.jpg
If you look at the diagram above, there is a special group called CREATOR OWNER. This group is inherited from the drive root, and because of it, the person who creates files and folders is automatically assigned ownership of those files and folders, as long as the group is listed on the ACL.

 

I have shared folders ranging in size from 10 GB to 250 GB; I need some method to take ownership of all folders without destroying the existing folder permissions.

There are TWO options left:

Either I take folder ownership from top to bottom without destroying existing permissions

OR

I find a user who already has full control permissions on the folder, have them grant my admin account access, and take it from there. There are multiple free tools available on the internet to accomplish this. Membership in the server's local Administrators group is the minimum prerequisite for using any of these tools.


Takeown – Built-in tool available in Windows-based systems for managing folder ownership


Takeown has its own limitations and can destroy existing NTFS permissions in addition to taking folder ownership. To take ownership with the Takeown utility without destroying existing permissions, you must have at least read permissions on the folders and files; otherwise you cannot take ownership. The verdict: until you get ownership of all subfolders and files, you have to run the two commands below one by one, again and again.

takeown /f <directory path> /r /a
    where
    /f specifies the file or folder
    /r stands for recursive
    /a assigns ownership to the Administrators group

AND

Icacls <directory path> /grant administrators:f /t
    The /t switch takes care of subfolders and files
    f stands for Full Control permission

Example:
takeown /f C:\TFolders /r /a
Icacls C:\TFolders /grant administrators:f /t

Folder-Access-9.jpg
In the above example, Takeown has assigned ownership of the "C:\TFolders" root to the Administrators group only, even though the /r switch was specified for recursive ownership, because you do not have read permissions on the subfolders and files. If you press Y at the prompt, all folder permissions will be destroyed and only your admin account will be granted full control permissions. You can specify the additional /D switch with a Y or N parameter to suppress each permission-replacement prompt. At this point you have ownership of the root folder only; you still have no ownership of the subfolders, nor any permissions on the root folder or subfolders.


The same thing happens when you try to take folder ownership recursively from the GUI:

Folder-Access-10.jpg
In the above snapshot, if you select Yes, it will destroy the existing folder permissions by granting you full control in addition to ownership.

 

Now that you have ownership of the root folder, you need to run the command below with the built-in Icacls utility to grant Administrators full control. It will grant Administrators full control on the root folder only, because you don't have ownership of the rest of the subfolders and files yet.

Folder-Access-12.jpg
Again you have to run the Takeown utility to take ownership of further subfolders and files, now that you have access to the root folder.

Folder-Access-13.jpg
Once you have ownership of further folders, you again need to assign permissions with the Icacls utility, as shown below.

Folder-Access-14.jpg
In the above screenshot there is still one access-denied error. You need to run both commands repeatedly until you have ownership of, and access to, the entire folder tree. Then you can manage all aspects of that folder.

  

Subinacl – Free utility available from Microsoft

 

SetACL and Subinacl are very powerful tools and can do much more than Takeown; I prefer them over the Takeown utility. Their major advantage is that they can take ownership of an entire folder tree, including subfolders and files, in one shot and without destroying existing permissions, even if you don't have read permissions on the folder root, subfolders, or files.


Syntax of command: 

Syntax:
Subinacl /noverbose /subdirectories <directory path> <action parameter>

Examples:
To take ownership of the folder root:
Subinacl /noverbose /subdirectories F:\Projects\1016120 /setowner=administrators
If the folder name contains spaces:
Subinacl /noverbose /subdirectories "F:\Projects\My IMP Data" /setowner=administrators

To take ownership of all subfolders and files underneath the root folder:
Subinacl /noverbose /subdirectories F:\Projects\1016120\ /setowner=administrators
If the folder name contains spaces:
Subinacl /noverbose /subdirectories "F:\Projects\My IMP Data\*" /setowner=administrators

To grant administrators full control on the folder root:
Subinacl /noverbose /subdirectories F:\Projects\1016120 /grant=administrators=f
If the folder name contains spaces:
Subinacl /noverbose /subdirectories "F:\Projects\My IMP Data" /grant=administrators=f

To grant administrators full control on all subfolders and files underneath the folder root:
Subinacl /noverbose /subdirectories F:\Projects\1016120\ /grant=administrators=f
If the folder name contains spaces:
Subinacl /noverbose /subdirectories "F:\Projects\My IMP Data\*" /grant=administrators=f


The example below shows how to take folder ownership and gain access with the Subinacl tool. It can take ownership of all subfolders and files, including the root folder, and grant full control access to the built-in Administrators group without destroying any existing permissions.

Folder-Access-15.jpg
The Subinacl utility gives you one additional facility: it allows you to back up the NTFS security, along with ownership, of an entire folder before making any changes. If you make a mistake while taking folder ownership or modifying the folder access control list, you can restore the entire NTFS access control list.


Syntax of command:

Subinacl /noverbose <action parameter> /subdirectories <directory path>

To back up NTFS permissions of the root folder:
Subinacl /noverbose /output=C:\TFolders_Root.txt /subdirectories C:\TFolders
If the folder name contains spaces:
Subinacl /noverbose /output=C:\MyData_Root.txt /subdirectories "C:\My Data"

To back up NTFS permissions of all subfolders and files underneath the root folder:
Subinacl /noverbose /output=C:\TFolders_Child.txt /subdirectories C:\TFolders\
If the folder name contains spaces:
Subinacl /noverbose /output=C:\MyData_Child.txt /subdirectories "C:\My Data\*"

To restore NTFS permissions on the folder root:
Subinacl /noverbose /playfile C:\TFolders_Root.txt

To restore NTFS permissions on subfolders:
Subinacl /noverbose /playfile C:\TFolders_Child.txt

The first command restores security on the root folder (C:\TFolders).
The second command restores security on all subfolders and files underneath the folder root (C:\TFolders\*).

For example:

Folder-Access-18.jpg
The Subinacl command-line reference help file is attached here: subinacl.zip


SetACL

The command-line version is freeware. There is no need to install it, as it is a standalone .exe file: download it and run it from an elevated command prompt. This utility works just as well as Subinacl and is likewise capable of taking folder ownership and granting folder access without destroying existing folder permissions.


Syntax of command: 

SetAcl -on <directory path> -ot <object type> -actn <parameter> -rec cont_obj -silent
    where
    -on stands for "object name", the path of the directory
    -ot stands for "object type"
    -actn is the action to be performed, setting the owner (setowner) in our case
    -rec makes the action recursive, carried out on all subfolders and files (cont_obj)
    -silent suppresses output to the screen

Examples:
To set the owner on an entire folder tree:
SetAcl -on C:\TFolders -ot file -actn setowner -ownr n:administrators -rec cont_obj -silent
If the folder name contains spaces:
SetAcl -on "C:\My Imp Data" -ot file -actn setowner -ownr n:administrators -rec cont_obj -silent

To grant the Administrators group full control on an entire folder tree:
SetAcl -on C:\TFolders -ot file -actn ace -ace "n:administrators;p:full" -rec cont_obj -silent
If the folder name contains spaces:
SetAcl -on "C:\My Imp Data" -ot file -actn ace -ace "n:administrators;p:full" -rec cont_obj -silent

For example:

Folder-Access-16.jpg
The above command assigns ownership of the entire folder tree to the built-in Administrators group and grants it full control access permissions without destroying any existing folder permissions. You can refer to the SetACL online command reference for more information: https://helgeklein.com/setacl/documentation/command-line-version-setacl-exe/


Some best practices for setting up standard shared folders to minimize management effort:


  1. Always share folders with Everyone / Full Control share permissions.
  2. Control user access through the NTFS access control list.
  3. In order to control user access through NTFS permissions, disable inheritance on the root shared folder from the advanced NTFS security page.
  4. Avoid granting users Full Control NTFS permissions on root shares and subfolders unless absolutely necessary.
  5. Ensure that the server's local Administrators group has Full Control NTFS permissions on the root share and has root folder ownership as well. Never grant an individual administrator account Full Control NTFS permissions.
  6. Remove the CREATOR OWNER group from the root share. This is the main culprit behind most folder ownership and access issues; removing it ensures that individual users never gain ownership of subfolders and files.
  7. Try to avoid granting Deny permissions to users or groups on the NTFS access control list.
  8. Avoid granting permissions to individual users on the shared folder access control list as far as possible.
  9. Instead of adding individual users to the access control list, create global security groups, add the required users to them, and grant these security groups the appropriate rights on the access control list.
  10. The process to set up roaming profiles is a bit different from the above; by default these folders are not accessible to administrators. However, you can apply group policy in advance on the server where you want to store roaming profiles so that the built-in Administrators group can access the roaming profile folders if necessary. The GPO setting "Add the Administrators security group to roaming user profiles" can be found under Computer Configuration => Administrative Templates => System => User Profiles. A great article on setting up roaming profiles and home directories has already been published on the TechNet blog: http://blogs.technet.com/b/askds/archive/2008/06/30/automatic-creation-of-user-folders-for-home-roaming-profile-and-redirected-folders.aspx
  11. Another option is to take complete ownership of the roaming profile share with SetACL or Subinacl without destroying the existing ACL, and then add the Administrators group to the roaming profile root share; that will eventually be inherited by the individual profile folders.
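As a rough sketch, practices 1 through 9 for a new share might look like the following with the built-in tools; the path and group names below are examples only, not from this article.

    :: Share with Everyone / Full Control; access is controlled via NTFS
    net share Projects=D:\Shares\Projects /grant:everyone,FULL

    :: Disable inheritance on the share root (keeping a copy of inherited ACEs)
    icacls D:\Shares\Projects /inheritance:d

    :: Local Administrators group gets Full Control on the root
    icacls D:\Shares\Projects /grant "Administrators:(OI)(CI)F"

    :: Grant a global security group Modify rights instead of individual users
    icacls D:\Shares\Projects /grant "CONTOSO\Projects-RW:(OI)(CI)M"

    :: Remove CREATOR OWNER so users never become owners of what they create
    icacls D:\Shares\Projects /remove "CREATOR OWNER"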
LVL 37

Author Comment

by:Mahesh
Thanks

There is a slight difference between Authenticated Users and Everyone.
The Everyone group contains the Guest, IUSR, and IWAM accounts, in addition to authenticated users and domain users in trusted domains and forests.
Previously, anonymous users were part of the Everyone group, but as of Windows Server 2003 AD this was removed.

The Authenticated Users group includes all users whose identities were authenticated when they logged on. This includes local user accounts as well as all domain user accounts from trusted domains and forests.
Authenticated Users does not contain the Guest, IUSR, IWAM, Anonymous, Local Service, or Network Service accounts.
Normally these accounts cannot log on to any machine to access shared resources, and the Guest account is disabled by default unless you enable it.

In practice I really do not see a noticeable difference between the two; however, you may use Authenticated Users instead of Everyone.
The major permission control remains the NTFS permissions.

Probably we need to disable UAC, otherwise it will prompt unnecessarily; some organizations have a policy of keeping UAC enabled.

Normally I do want to clear CREATOR OWNER from the shared folder root at the beginning. You can remove it from the drive root as well, though I don't think that is required.

I have observed on 2012 and later servers that if you are a server administrator and try to open a shared folder for which you have no access on the NTFS ACL, and you access it through the local path, you get a popup where you can click Continue to gain access.

Expert Comment

by:Gaurav Chauhan
Many thanks for this detailed article. The Subinacl tool is just awesome, far better than icacls; it solved my greatest problem. I am surprised this tool is mentioned nowhere; it should be promoted as a built-in tool by Microsoft. Many thanks again.

A while back I was asked by our Software Asset Manager if they could be alerted about specific applications that had been installed recently, focusing on things like SQL, Visual Studio, and Exchange. With it being our true-up time, I decided to have a look at how we could integrate the "Software Installed in X days" query into Orchestrator 2012.

It is a simple runbook that uses the query criteria, outputs the data to a *.csv file, and then mails that file off to the SAM.

Below are some of the basic steps

Application Install Alerts Runbook Activities
I have used a "Monitor Date/Time" activity at the start, but you could also use "Task Scheduler", which might be better as your runbook would then not be running all the time; time wasn't on my side :) This runbook fires every 7 days, and that interval is also one of the filters in my SQL query.

The “Check File Exists” and “Delete File” are merely there so that a new file is generated every time it is mailed off giving a cleaner file when the guys need to go through the list.

In the “Run SQL Query” activity I have added a predefined filter of the applications that we are interested in and want alerting on.
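The query behind that activity might look something like the sketch below, wrapped here in the PowerShell steps the runbook performs (server, database, path and address names are all hypothetical; verify the view and column names against your own site database, and note that InstallDate0 is often stored as a yyyymmdd string, so the date filter may need adjusting):

```powershell
# Sketch only - names are hypothetical; verify views/columns against your site DB.
$query = @"
SELECT sys.Name0 AS MachineName, arp.DisplayName0 AS Application, arp.InstallDate0 AS InstallDate
FROM v_GS_ADD_REMOVE_PROGRAMS arp
JOIN v_R_System sys ON sys.ResourceID = arp.ResourceID
WHERE (arp.DisplayName0 LIKE '%SQL Server%' OR arp.DisplayName0 LIKE '%Visual Studio%'
       OR arp.DisplayName0 LIKE '%Exchange%')
  AND arp.InstallDate0 >= DATEADD(day, -7, GETDATE())
"@
Invoke-Sqlcmd -ServerInstance "SCCMSQL01" -Database "CM_ABC" -Query $query |
    Export-Csv C:\Reports\RecentInstalls.csv -NoTypeInformation

# Mail the fresh file off to the SAM (addresses and SMTP host hypothetical)
Send-MailMessage -To "sam@contoso.com" -From "scorch@contoso.com" `
    -Subject "Software installed in the last 7 days" `
    -Attachments C:\Reports\RecentInstalls.csv -SmtpServer "smtp.contoso.com"
```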

As you can see, this is a simple but effective runbook and can be changed to add many other activities, such as logging a call, automatic removal and more.

I have also recently added a "Get installers Email" activity that will also mail the user who installed this software that we are tracking and ask them to please mail the …
If I am using the Hyper-V Server (Core) edition as the hypervisor, what options are available to manage VMs remotely?

How are you going to manage it from your desktop PC? You do not want to have to use Remote Desktop Protocol (RDP) to connect to the server and launch Hyper-V Manager every time you want to administer Hyper-V. Thus, you need the Hyper-V tools for remote management up and running whenever you need them.

What if I am not using a domain environment? (What permissions are required for the two machines to authenticate with each other?)

So what are the prerequisites for my scenario?

A client computer that is running Windows 7, and that is connected to the same network where the virtualization server is connected (both computers in a workgroup or both in a domain).


You can install Hyper-V Manager on a Windows 7 machine, and from that computer you can manage the virtual machines that are running on your virtualization server. The user experience is the same as that of Hyper-V Manager running on the virtualization server.



Download the Remote Server Administration Tools (Windows 7 Professional or Ultimate only)


    On your Windows 7 machine, download the correct version of the tool from
http://www.microsoft.com/download/en/details.aspx?displaylang=en&id=7887

   There is a 32-bit version (Windows6.1-KB958830-x86-RefreshPkg.msu) and a 64-bit version (Windows6.1-KB958830-x64-RefreshPkg.msu).

   Install the application.
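For the workgroup scenario described above, the usual prerequisite is a matching local administrator account on both machines plus cached credentials on the client; a minimal sketch (the account name and password are hypothetical):

```powershell
# On BOTH the Windows 7 client and the Hyper-V host, create the same local
# account with the same password (hypothetical values) and make it an admin:
net user HvAdmin "P@ssw0rd!" /add
net localgroup Administrators HvAdmin /add

# On the Windows 7 client only, cache the credentials used to reach the host
# (replace HV-HOST with your server's name):
cmdkey /add:HV-HOST /user:HV-HOST\HvAdmin /pass:"P@ssw0rd!"
```

Hyper-V Manager also needs DCOM/WMI remote access opened up on both sides when in a workgroup; Microsoft's HVRemote script automates those extra firewall and permission steps.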

Create the same

Expert Comment

by:SKY 75
Good explanation,  just like experts...
 

Expert Comment

by:Ad Vee
Does not work for Server 2012 R2 <-> Win 7
A step-by-step introduction to multipathing, what it is, and how to set it up using iSCSI initiators with MPIO in Windows Server or desktop operating systems.

Expert Comment

by:aa-denver
Great article.
SCCM 2012 Application Approval Process Problem

I am not sure how many of you use the new feature in SCCM called “Application Catalog” to allow users to request & install software.

Well we have implemented it to control what software gets installed & allow us to get users to request software.

Now, one problem we had with this was that we would never know a user had requested an application for approval unless we constantly refreshed the “Approval Requests” tab in the SCCM console. This was a big issue for us, as we needed our SAM to be notified when new requests were made and to be able to approve them without having to go into the console.

This is the solution we came up with


Systems Used

•      SCCM 2012 SP1
•      Orchestrator 2012 SP1
•      Visual Studio 2010
•      HP Service Manager
•      Active Directory


Thanks To the following people:
•      For the integration into our HP system I thank our local HPSM guys
Chris.Visagie
•      For providing the outline of the runbooks which contained the PowerShell scripts as well as an idea for the ASP.Net page.
Neil Peterson  - http://blogs.technet.com/b/neilp/archive/2012/09/25/configuration-manager-application-request-notification-and-approval-solution.aspx
As you will see in my article I have changed the layout a bit to suit my needs & my environment.
•      For providing the SCORCH IP’s
http://technet.microsoft.com/en-us/library/hh295851.aspx


Firstly what we did was import the needed IP’s into …
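As a side note for anyone on SCCM 2012 SP1 or later: the approval side can also be scripted with the ConfigMgr PowerShell cmdlets rather than the console. A rough sketch (the site code and names are hypothetical, and cmdlet parameters vary between ConfigMgr releases, so check Get-Help on your version first):

```powershell
# Load the ConfigMgr module and switch to the site drive (site code hypothetical)
Import-Module "$env:SMS_ADMIN_UI_PATH\..\ConfigurationManager.psd1"
Set-Location ABC:

# List outstanding application approval requests, e.g. to feed a notification mail
Get-CMApprovalRequest | Select-Object User, Application, Comments

# Approve one once the SAM has signed off (parameter names vary by release)
Approve-CMApprovalRequest -Application "Visual Studio 2010" -User "CONTOSO\jdoe"
```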
 

Expert Comment

by:James Avery
Leon,

Have you finished the "We are also working on auto quoting workflow if the user does not have a license which would then be incorporated into the “CM Approval Mail” runbook & will give the SAM an extra option to select something like “No License”?

I'm working on creating one like this as well and I wanted to know if you have completed it and if I can borrow the code?

I want to also add the "Optional Reference" field in the application to signify if the application requires a license or not. Thoughts?

James

Author Comment

by:Leon Taljaard
Hi James

Sorry for the late reply, it has been madness :)

I have not had a chance to get going with the Application Approval work flow because I have been focusing on some other urgent requirements that were needed like New Infrastructure Requests/Quotes, New Drive Backup Requests, Telephone Requests and now currently New Desktop/Laptop and personal Equipment requests.

Now all of the above work from a custom web page where the user inputs their details and either gets the quote directly on the page dynamically or requests the quote from our Facilities team and then gets the quote back for approval before order. Really very cool, and the back end is all SCSM, Orchestrator and some custom DB's :)

What I was thinking of is taking our local DB, which our software asset manager currently works from, and every time there is a request for an application, referencing that DB; if the application is not found there, the requester would be sent a link or a price for it, use the link to approve, be required to fill in a form with their cost code and so on, and only be approved afterwards.

I have so many ideas on it but it is just to get it going, I know this doesn't help you much right now but if you would like more help or ideas let me know and I would be more than willing to help anytime, it might be a little delayed but I will always reply :)

If I get this quoting going for the application approval I will most definitely share this with you.

Let me know

Thanks
This tutorial will go through the steps required to promote a Windows 2012 server to a domain controller in an existing Active Directory Forest / Domain. It is important to note that dcpromo (the tool used to promote / demote domain controllers in previous Windows versions) has been deprecated and has been replaced by the Active Directory Domain Services Configuration wizard.

The following assumptions have also been made:

You have an existing forest in place
The Forest Functional Level is at least Windows 2003 or higher in order to introduce a Windows 2012 domain controller
You have installed and configured DNS
You are either a member of Domain Admins / Enterprise Admins / Schema Admins (as appropriate) or have been delegated the relevant rights
Microsoft best practice will be followed in order to split the database and logs onto different spindles

1. Check Forest & Domain Functional Levels


To check the existing Forest / Domain Functionality Level you can use Powershell. You can install the Active Directory module by using any of the following methods.

On a Windows Server 2008 R2 or Windows 2012 Server when you install the AD DS or AD LDS server roles
When you make a Windows 2008R2 or a Windows 2012 server a domain controller by running dcpromo.exe / AD DS Configuration Wizard
As part of the Remote Server Administration Tools (RSAT) on a Windows Server 2008R2 or a Windows 2012 server
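Once the module is available, the check itself is short:

```powershell
# Requires the Active Directory module installed by any of the methods above
Import-Module ActiveDirectory

# Forest-wide functional level (Windows2003Forest or higher is required here)
(Get-ADForest).ForestMode

# Functional level of the current domain
(Get-ADDomain).DomainMode
```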

Author Comment

by:Raheem05
It certainly should not take 3 hours; have you checked the debug logs?

They are located at:

%systemroot%\debug\dcpromo.log

%systemroot%\debug\dcpromoui.log

Raheem
 

Expert Comment

by:JPD153
Will this procedure work for migrating AD from SBS 2003 to Windows Server 2012 R2 running as a VM?
If you want to replace a server which holds file shares like homes or profiles you have to find a way to move them to the new server without taking ownership of these folders.

If you are running Microsoft Windows Server 2003 R2 or above, there is an easy way to achieve this (plain Server 2003 does not include the needed replication service): the Distributed File System (DFS).
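(As an aside: if you only need a one-time move rather than ongoing replication, robocopy run from an elevated prompt can also copy the data with ownership and ACLs intact; the server and share names below are hypothetical.)

```powershell
# /MIR mirrors the tree; /COPYALL copies data, attributes, timestamps,
# security (ACLs), owner and auditing info; /B uses backup mode so you do
# not need to take ownership; /R:1 /W:1 keeps retries short.
robocopy \\OLDSRV\homes \\NEWSRV\homes /MIR /COPYALL /B /R:1 /W:1 /LOG:C:\robocopy-homes.log
```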

Prepare the old Server:

-Click start and open "Manage Your Server"
You can find a section called "File Server"

-Click Upgrade this role
The upcoming wizard shows all possible components to add.

-Select Replicate data to and from this server (Installs DFS Replication Service)
And click next.

The wizard will now install the DFS Replication service on this server. If it asks for the R2 components, insert Windows Server 2003 R2 CD 2. After the installation completes, reboot the server.

On the new Server 2008 R2 server

- Open Server Manager
- Click Add Role and select File Services
- Check Distributed File System and install the role.
- Reboot the Server.

(If you already installed File Service role, choose Add Role Service and check Distributed File System)


Navigate to Start-Administrative Tools-DFS Management

-Right-click Replication and choose New Replication Group

-Check Multipurpose replication group (should be checked by default) and click Next

-Type a name for the replication group. (It doesn't matter; it's only for you to …
This tutorial will walk through a few ways to restart / shutdown Windows 2012 servers, we will also go through the charm and start options.

Windows Server 2012 uses a Start screen. Start is a window, not a menu, and programs can have tiles on the Start screen. Tapping or clicking a tile runs the program. When you press and hold or right-click a program, an optional panel is displayed.

The charms bar is an optional panel for Start, Desktop and Server Settings. With a mouse and keyboard, you can display the charms by moving the mouse pointer over the hidden button in the upper-right or lower-right corner of the Start, Desktop or Server Settings screen, or by pressing the keyboard combination Windows Key + C

Windows + C
One way to quickly open a program is by pressing the Windows Key, typing the program name, and then pressing Enter. This shortcut works as long as the Apps Search box is in focus, which it typically is by default.

Pressing the Windows Key toggles between the Start screen and the desktop or if you are working with Server settings then between the Start screen and the Server Setting. You can also display the desktop by using the following keyboard shortcut Windows Key + D:

Windows + D
You can access Control Panel from Start, or from your desktop by opening the charms, clicking Settings and then Control Panel. Additionally, as Windows Explorer is pinned to the taskbar, you can launch Windows Explorer and simply type Control Panel followed by Enter.

Settings Control Panel
Explorer Control Panel

1. Shutdown & Restart using Power Settings Method



Expert Comment

by:Akinsd
Additional methods from Run Command or Command Line

Launch Run Window
Windows Logo + R
Type shutdown

You can add other switches
Usage: shutdown [/i | /l | /s | /r | /g | /a | /p | /h | /e | /o] [/hybrid] [/f]

    [/m \\computer][/t xxx][/d [p|u:]xx:yy [/c "comment"]]

    No args    Display help. This is the same as typing /?.
    /?         Display help. This is the same as not typing any options.
    /i         Display the graphical user interface (GUI).
               This must be the first option.
    /l         Log off. This cannot be used with /m or /d options.
    /s         Shutdown the computer.
    /r         Full shutdown and restart the computer.
    /g         Full shutdown and restart the computer. After the system is
               rebooted, restart any registered applications.
    /a         Abort a system shutdown.
               This can only be used during the time-out period.
    /p         Turn off the local computer with no time-out or warning.
               Can be used with /d and /f options.
    /h         Hibernate the local computer.
               Can be used with the /f option.
    /hybrid    Performs a shutdown of the computer and prepares it for fast startup.
               Must be used with /s option.
    /e         Document the reason for an unexpected shutdown of a computer.
    /o         Go to the advanced boot options menu and restart the computer.
               Must be used with /r option.
    /m \\computer Specify the target computer.
    /t xxx     Set the time-out period before shutdown to xxx seconds.
               The valid range is 0-315360000 (10 years), with a default of 30.
               If the timeout period is greater than 0, the /f parameter is
               implied.
    /c "comment" Comment on the reason for the restart or shutdown.
               Maximum of 512 characters allowed.
    /f         Force running applications to close without forewarning users.
               The /f parameter is implied when a value greater than 0 is
               specified for the /t parameter.
    /d [p|u:]xx:yy  Provide the reason for the restart or shutdown.
               p indicates that the restart or shutdown is planned.
               u indicates that the reason is user defined.
               If neither p nor u is specified the restart or shutdown is
               unplanned.
               xx is the major reason number (positive integer less than 256).
               yy is the minor reason number (positive integer less than 65536).
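A few common combinations of the switches above:

```powershell
# Restart immediately
shutdown /r /t 0

# Shut down in 5 minutes, showing users a comment and a planned reason code
shutdown /s /t 300 /c "Maintenance window" /d p:0:0

# Restart a remote computer (name hypothetical), forcing applications closed
shutdown /m \\FILESRV01 /r /f

# Abort a pending shutdown during the time-out period
shutdown /a
```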
 

Expert Comment

by:Kimo Yoshida
Excellent!
Remote Apps is a feature in Server 2008 which allows users to run applications off Remote Desktop Servers without having to log into them to run the applications. The user can either have a desktop shortcut installed or go through the web portal to launch applications. The reason an IT administrator would take this approach to running applications is the end-user experience.

The end users now simply click a shortcut on their desktop and the application begins to run. To the user it looks as if the application is installed locally. Any printers that the user has get redirected, and if they have dual screens they can move the application around between screens just as if it were locally installed. The bonus for IT administrators is that they can control who has access to an application.

Below are the steps to install Remote Desktop Services on one server with RemoteApp functionality. These are the basic steps to get the server going so you can play with the features.

1. Install server 2008 R2

2. From Server Manager install the role Remote Desktop Services
 Roles required for RDS
3. Install the Remote Desktop Session Host and the Remote Desktop Web Access Services for the role. (IIS will be required for this but it will install for you)
 Services Required
4. Now at this point you have all the core services and roles installed to get going.  We did not install services such as gateway, or licencing as those are not required to get …

Expert Comment

by:tigermatt
jessiepak,

This is a good article. I've voted "Yes" above to indicate this was useful and to thank you for taking the time to put this together. Thanks for your efforts!

-Matt
 

Expert Comment

by:a245439
jessiepak, great article, however I have a question.  Do I have to publish my apps via the Web Interface in order for them to be accessible via a server farm or can I use either the .rdp file or the .msi install and still use them in a server farm.

All my research only talks about using the web interface in a server farm with remote apps.
Some time ago I faced the need to use a uniform folder structure that spanned across numerous sites of an enterprise to be used as a common repository for the Software packages of the Configuration Manager 2007 infrastructure.

Because the procedure was relatively lengthy and the final outcome was extremely good, I started compiling a guide to be used during the setup of a DFS structure for software distribution via Configuration Manager, and bundled it with the Office 2007 deployment package instructions.

The instructions were written for a Windows 2008 Server but can be used on Windows 2003 R2 and Windows 2008 R2 Servers.

Keep in mind that this is not the official recommended way and Microsoft doesn’t support it.

You may download the file from my SkyDrive by clicking the following linked title: Deploying software that resides in DFS Folders via Configuration Manager 2007

Here's an excerpt from the document:

Why use DFS as your software repository

Consider the following scenario: you have one Primary/Central Site server with two (2) secondary site servers that connect over a saturated or bandwidth-constrained WAN link, and control of the consumed bandwidth is a high priority.
  •  You need one logical place to access the data from any of the three sites and always access the closest one.
  • You want to quickly move the data to another drive when hard drive space is limited
 

Expert Comment

by:CPA_TDA
The link for : Deploying software that resides in DFS Folders via Configuration Manager 2007 does not work

Author Comment

by:George Simos
I apologize for the link error, however I've triple checked the tiny-url link before submitting the article and all was fine.
I've recreated the tiny-url link and it is ok now.

Have you considered what group policies are backwards and forwards compatible?

Windows Active Directory servers and clients use group policy templates to deploy sets of policies within your domain. But there is a catch to deploying policies: the policy templates are not forwards compatible with the latest operating systems. In other words, if you use a 2003 or 2003 R2 server, you cannot administer the newer group policy settings for a Vista or Win7 computer.

CAUSE:
Windows 7 and Vista use ADMX admin templates for group policies. The 'latest' legacy machines (meaning XP, 2003 server, and 2000) use ADM templates for group policy.

SYMPTOMS:
If you try to administer policies for a Win7, Vista, or 2008 server from a 2003 server, you will probably see problems with the Win7 machines. The symptoms I see on Experts Exchange when helping administrators include:

Slow logons
Group Policy core failures
Group Policy not applying to Vista and Win 7 computers
Group Policy event log errors on the clients and server

Example:
Group Policy Core Failure, and Win7 computers take 6-10 minutes to log on

POTENTIAL FIXES:
1) Of course, you can deploy a 2008 server. Some businesses do not have the budget.
2008 servers can provide policies for ADM and ADMX templates.

2) A work around was found and outlined on this very-well written article:
Author: Mark Menges
"Supporting Windows 7 Group Policy Settings with Windows Server 2003 Domain Controllers"
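A related companion step once you do have a Windows 7/2008-era management machine available: copy its ADMX templates into a central store in SYSVOL, so that every GPMC in the domain picks up the new templates automatically (the domain name below is hypothetical):

```powershell
# Run on the Win7/2008 management machine; DCs replicate the folder via SYSVOL,
# and GPMC prefers the central store over local templates once it exists.
Copy-Item -Recurse C:\Windows\PolicyDefinitions `
    "\\contoso.com\SYSVOL\contoso.com\Policies\PolicyDefinitions"
```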
Welcome to my series of short tips on migrations. Whilst based on Microsoft migrations the same principles can be applied to any type of migration.

My first tip Migration Tip #1 – Source Server Health can be found here: http://www.experts-exchange.com/OS/Microsoft_Operating_Systems/Server/A_6234-Migration-Tip-1-Source-Server-Health.html

My second tip Migration Tip #2 – The Practice Run can be found here: http://www.experts-exchange.com/OS/Microsoft_Operating_Systems/Server/A_6235-Migration-Tip-2-The-Practice-Run.html

So, we now have a healthy source server and you have practiced until the matchsticks snap. What next?

My third tip is about making sure you are prepared for the task ahead. Any type of migration needs to be taken seriously.  It is a business critical operation you are about to embark on. If you are in any doubt about it at all, now is the time to say so, and if necessary call in help.

If you are happy with the process and confident you are able to complete the steps to achieve your goal then the next thing we need to do is plan a time to do it.

Most migrations are not time-limited, other than SBS-to-SBS migrations, which have a limit of 21 days during which both SBS servers can co-exist.

Make people aware of what you are doing.  Involve them, explain that you are expecting to have teething problems but would prefer if they collated them and then passed them to you when you ask for them. The last thing you want is to try trouble …
Welcome to my series of short tips on migrations. Whilst based on Microsoft migrations the same principles can be applied to any type of migration.

My first tip Migration Tip #1 – Source Server Health can be found listed in my profile here: http://www.experts-exchange.com/OS/Microsoft_Operating_Systems/Server/A_6234-Migration-Tip-1-Source-Server-Health.html

My second tip is about making sure you are familiar with the technology you are migrating to.

For many people, migrating to a new technology will be the first and only time they perform this task. So, it’s always a good idea to familiarise yourself with the setup process before you do it for real.  With the use of virtualisation technologies we can install and test new products without the need for new hardware and without the possible impact on our live environment.

There are a number of virtualisation products that will allow you to do this on your desktop/laptop computer. You need to consider that most new products (if not all) will be based on the x64 architecture. This does limit the virtualisation technologies that you can use on the desktop. Some of my favourites are listed below.

VMWare Workstation, this is a paid product but worth its weight in gold: http://store.vmware.com/store/vmwde/en_IE/pd/productID.166452200/Currency.GBP/?src=PaidSearch_Google_PersonalDesktop_WKSN_EMEA_UK_EN_Brand
VMWare Server, this is free for use and technically should only be used on a Server Operating System, but it does…
Welcome to my series of short tips on migrations. Whilst based on Microsoft migrations the same principles can be applied to any type of migration.

My first tip is around source server preparation.

No migration is an easy migration, there is always potential for something to go wrong. All we can do is try to minimize this risk.

The biggest risk comes from the system we already have in place, the integrity of this system is paramount in ensuring a successful migration.

Making sure your source system is healthy and configured correctly will go a long way to ensuring you have a smooth migration.

Use analyzers and health check tools that are available from the vendor. Microsoft, for example, have a number of best practice analyzer tools. These can be used to identify any problems the system may have and provide advice on how to resolve them. Some of the ones I use regularly are listed below:

Small Business Server 2003 BPA: http://www.microsoft.com/downloads/en/details.aspx?FamilyID=3874527a-de19-49bb-800f-352f3b6f2922&displaylang=en
Small Business Server 2008 BPA: http://www.microsoft.com/downloads/en/details.aspx?familyid=86a1aa32-9814-484e-bd43-3e42aec7f731&displaylang=en
Exchange Server BPA (not for Exchange 2007 or 2010, the built in BPA should be used): http://www.microsoft.com/downloads/en/details.aspx?familyid=dbab201f-4bee-4943-ac22-e2ddbd258df3&displaylang=en
Internet Security and Acceleration Server BPA: …
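On Windows Server 2008 R2 and later, the built-in per-role BPAs can also be driven from PowerShell via the BestPractices module; a sketch (the model ID shown is the AD DS one, and the model-ID parameter name differs slightly between OS releases, so check Get-Help Invoke-BpaModel on your version):

```powershell
Import-Module BestPractices

# Discover which BPA models are installed on this server
Get-BpaModel

# Run the AD DS scan and list everything that is not merely informational
Invoke-BpaModel -ModelId Microsoft/Windows/DirectoryServices
Get-BpaResult -ModelId Microsoft/Windows/DirectoryServices |
    Where-Object { $_.Severity -ne "Information" }
```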
On a regular basis I get questions about slow RDP performance, RDP connection problems, strange errors and even BSOD, remote computers freezing or restarting after initiation of a remote session.

In a lot of these cases, the quick solutions chosen by customers are even worse than the problem itself: 3rd-party remote solutions without encryption, authentication or any care for information privacy. I don't like it when things like this happen, because changing to other solutions without any benefit is bad for our business.

What I want to show people in this article is that RDP is something you have to care about, in the sense of putting more work and thought into its further development, because it's THE remote tool of a Windows administrator (or was, until PowerShell :-).

First, RDP is a service/role/feature on a server, and even if you don't use the full Terminal Services from Microsoft, it is something you should care about. Most people care by applying critical hotfixes to servers and workstations on a regular basis while avoiding service packs for a long time. I think Microsoft already knows this, and ships cool features in smaller updates rather than only in full SPs.

Great isn't it?

NO; the problem most people face is that they are not deep enough into the Microsoft patch world (or don't have enough time to get into it), or just underestimate what hides behind all the other (non-critical, non-security) updates out …
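A quick way to see what a given box actually has before digging through those optional updates is to list its installed hotfixes, newest first:

```powershell
# Installed updates, most recent first
Get-HotFix | Sort-Object InstalledOn -Descending | Select-Object -First 20

# Check for one specific update; the KB number here is only an example
Get-HotFix -Id KB2592687 -ErrorAction SilentlyContinue
```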
This is my 3rd article on SCCM in recent weeks, the 1st dealing with installation on a 2008 R2 host is here, the 2nd showing basic configuration is here. This article aims to provide a simple overview of Windows 7 deployment for the novice SCCM user - it's designed for those people using SCCM 2007 in a virtual/physical lab situation rather than those looking for production assistance.

SCCM 2007 is a powerful multi-purpose system with many enterprise applications, one of which is deployment. Over the last few weeks I've been testing Windows 7 deployment in a test lab and have discovered a number of things: 1) the guides which exist on the web are full of conflicting information and aren't entirely suited to novice IT professionals, and 2) the process is easier than you think. With …
 

Expert Comment

by:Kashdotcom
Hi:

I am sorry, but I found this article quite confusing. I did not understand: where did you get the win7professional.wim file?

The last part, after installation and capture, seems to duplicate what you showed at the beginning.

What is this path \\can-dev-dc2\SMS_CHQ\Client?

I could not complete my installation.

Kash