

Microsoft Azure is a cloud computing platform and infrastructure for building, deploying and managing applications and services through datacenters. It provides both platform-as-a-service (PaaS) and infrastructure-as-a-service (IaaS) services and supports many different programming languages, tools and frameworks, including both Microsoft-specific and third-party software and systems. Cloud Services is a PaaS environment and can be used to create scalable applications and services; there are specific software development kits (SDKs) provided by Microsoft for Python, Java, Node.js and .NET. Azure also has file and storage services, data management, analytics and DNS services.

I have several co-workers working from home. I sent them instructions on how to VPN into my Windows Server 2016.

Is there a log I can view which would indicate when (date/time) a co-worker creates a VPN connection to the server?
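Assuming the built-in Routing and Remote Access (RRAS) role is providing the VPN, connect and disconnect events are normally written to the System event log under the RemoteAccess provider (RRAS also keeps accounting logs under %windir%\System32\LogFiles). A sketch for pulling the recent entries; the exact event IDs vary by configuration, so filter by provider first and inspect the messages:

```powershell
# Show recent RRAS events; look for connect/disconnect entries
# carrying the user name and timestamp.
Get-WinEvent -FilterHashtable @{ LogName = 'System'; ProviderName = 'RemoteAccess' } -MaxEvents 50 |
    Select-Object TimeCreated, Id, Message |
    Format-List
```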
Starting about two weeks ago, multiple users have had their Office apps listed as unlicensed daily. The only methods that have worked to reactivate them are running the Office 365 activation tool, and several only needed a pending update run. These licenses, in collaboration with Intune, were rolled out over the past 60 days, but there is no app policy in place on the devices or users.

Users had not changed their passwords the day of or the day before this happened, and no changes had been made to the subscriptions themselves. The devices they were using have been listed as activated and assigned to them in Office 365 each of these times.
I was just wondering whether Azure AD Connect syncs SharePoint groups and AD distribution groups to SharePoint Online.


As always many thanks in advance for all insights.
We are running AD with two 2016 DCs with the usual GC/DNS, etc., at 2012 R2 FFL/DFL, but still have WINS installed on both (with replication). Not my choice, and I want to get rid of it, as I have assurances that none of the current applications or the servers running them still need WINS. I note we also have SharePoint 2013 and Exchange 2013 (clustered). What would be the best approach? I could remove the WINS entries from the static IPv4 settings on each server and then watch over a 24-hour window for anything breaking, which seems the safer option; or just stop WINS on the two DCs, see what happens over a week, and if nothing breaks, remove the roles altogether. Thoughts? Cheers!
I'm having trouble getting the set-manager call to work.
The permissions in my app are right, as far as I have understood.
The app also has the Company Administrator and the Helpdesk Administrator roles.
If those roles also have User.ReadWrite.All and Directory.ReadWrite.All, everything should be alright?

In PowerShell it works (but in another authentication context, of course):
Set-AzureADUserManager -ObjectId $user.ObjectId -RefObjectId $manager.ObjectId
Then I can see the Manager with get-AzureADUserManager -ObjectId $user.ObjectId

PUT https://graph.microsoft.com/v1.0/users/zzzzzzzz-zzzz-zzzz-zzzz-zzzzzzzzzzzz/manager/$ref
Content-type: application/json
Content-length: 92

{"@odata.id": "https://graph.microsoft.com/v1.0/users/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"}

Content-Length is the length of the JSON string, and I calculate it with strlen in PHP. I hope that's correct.

But this gives me no response, which is normal, I guess. Yet nothing actually happens. If I enter a wrong email or a wrong format in the input, I get a related error message, so part of it works. But not the set-manager part.
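A quick way to cross-check the raw request is Invoke-RestMethod, which computes Content-Length for you; a sketch, assuming $token already holds a valid app-only access token (the GUIDs are the placeholders from the question):

```powershell
# Assign the manager via Microsoft Graph. A successful call returns
# 204 No Content, so an empty response body is the expected outcome.
$body = @{ '@odata.id' = 'https://graph.microsoft.com/v1.0/users/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx' } |
    ConvertTo-Json
Invoke-RestMethod -Method Put `
    -Uri 'https://graph.microsoft.com/v1.0/users/zzzzzzzz-zzzz-zzzz-zzzz-zzzzzzzzzzzz/manager/$ref' `
    -Headers @{ Authorization = "Bearer $token" } `
    -ContentType 'application/json' `
    -Body $body
```

If the PHP call returns 204 but the manager still isn't set, it's worth decoding the token and verifying it actually carries the User.ReadWrite.All application permission with admin consent, since directory roles on the app don't necessarily show up as Graph permissions in the token.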
Getting this error when running this script. I've never worked with PnP before, but this is the error I get at the bottom of the script. Searching on it hasn't yielded much, and I have the latest modules installed.

format-default : The collection has not been initialized. It has not been requested or the request has not been executed. It may need to be explicitly requested.
    + CategoryInfo          : NotSpecified: (:) [format-default], CollectionNotInitializedException
    + FullyQualifiedErrorId : Microsoft.SharePoint.Client.CollectionNotInitializedException,Microsoft.PowerShell.Commands.FormatDefaultCommand

Any suggestions on where I might look for what is causing this, or what I might be missing, are appreciated.

Full script is located here on GitHub:  https://github.com/SmartterHealth/Virtual-Rounding/blob/master/Scripts/CreateTeamsAndSPO.ps1
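That CollectionNotInitializedException usually means a CSOM collection was enumerated before being loaded from the server; in PnP PowerShell you can request it explicitly with Get-PnPProperty. A minimal sketch (the site URL is a placeholder, not taken from the script):

```powershell
# CSOM objects are lazy-loaded; initialize a collection before using it.
Connect-PnPOnline -Url 'https://contoso.sharepoint.com/sites/demo' -Interactive
$web = Get-PnPWeb
Get-PnPProperty -ClientObject $web -Property Lists | Out-Null   # executes the request
$web.Lists | Select-Object Title
```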

This sample is provided as is and is not meant for use in a production environment.
It is provided only for illustrative purposes. The end user must test and modify the
sample to suit their target environment.

Microsoft can make no representation concerning the content of this sample. Microsoft
is providing this information only as a convenience to you. This is to inform you that
Microsoft has not tested the sample and therefore cannot make any representations
regarding the quality, safety, or suitability of any code or information found in this sample.


Windows Server 2016 Standard with Hyper-V running two VMs. One with Server Essentials, the other just another Server 2016.

In the hyper-v manager, you can see the two VMs at the top: RDS and ESS.

When I want to turn them off, I right-click on ESS and select Shut Down from the drop-down menu. It shuts down. When I then go to the RDS VM and do the same thing, I get the following error message:

An error occurred while attempting to shut down the virtual machine(s)

"Failed to shut down the virtual machine

You do not have permission to perform the operation. Contact your administrator if you believe you should have permission to perform this operation."

I then have to use Turn Off, which I don't like to use. I then get the message:

Do you want to turn off the selected VMs? This is equivalent to powering off a computer, so data loss is possible.

I have no choice, so I select that. But it scares me, because I don't do hard reboots on workstations, and this feels like the same thing. I believe I have admin rights on the production server. Maybe I don't. Maybe I shouldn't.

But shutting down the ESS VM works every time.
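As a cross-check on whether this is a console/permission quirk, the same guest shutdown can be attempted from an elevated PowerShell prompt on the host; a sketch using the VM name shown in Hyper-V Manager:

```powershell
# Request a clean guest shutdown (needs Integration Services in the guest
# and Hyper-V administrator rights on the host).
Stop-VM -Name 'RDS'
# Stop-VM -Name 'RDS' -TurnOff is the hard power-off equivalent
# and carries the same data-loss risk as the GUI's Turn Off.
```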

Robocopy cannot find a specific directory (Error 3) in a .cmd robocopy script. Hello Experts, I am trying to copy a folder from a Windows 2008 server to a Windows 2016 server using robocopy. I am getting a permissions error message. I am using an admin account. Any suggestions on how to correct this?

I'm running vSphere 6.7, and a VM which hosts my SCOM server is showing high CPU and I/O. How can I report performance on this server and isolate the service that is pegging resources? How would an expert approach this scenario?

Thank you so much!
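Inside the guest, per-process counters usually identify the offending service before you dig into vSphere-level charts; a minimal sketch to run on the SCOM server itself:

```powershell
# Sample per-process CPU once and list the top consumers.
Get-Counter '\Process(*)\% Processor Time' |
    Select-Object -ExpandProperty CounterSamples |
    Where-Object { $_.InstanceName -notin '_total', 'idle' } |
    Sort-Object CookedValue -Descending |
    Select-Object -First 10 InstanceName, CookedValue
```

From there, the process name can be mapped to a service (e.g. via Get-CimInstance Win32_Service) and compared with the VM-level stats vSphere reports.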
Looking for help in getting input from a .csv file for copying group members from one group to another.  Both are Unified groups.

This code works for a single group to group copy:
$List = Get-UnifiedGroupLinks -Identity "Group1" -LinkType Members | Select-Object -expandproperty name
Add-UnifiedGroupLinks -Identity "Group2" -LinkType Members -Links $List


This is what I set up in the foreach:

$Groups = Import-Csv -Path .\GroupInput.csv
foreach ($Group in $Groups) {
    $List = Get-UnifiedGroupLinks -Identity $_.SourceGroup -LinkType Members | Select-Object -expandproperty name
    Add-UnifiedGroupLinks -Identity $_.TargetGroup -LinkType Members -Links $List


Error I'm getting is:
Cannot bind argument to parameter 'Identity' because it is null.
    + CategoryInfo          : InvalidData: (:) [Get-UnifiedGroupLinks], ParameterBindingValidationException
    + FullyQualifiedErrorId : ParameterArgumentValidationErrorNullNotAllowed,Get-UnifiedGroupLinks
    + PSComputerName        : outlook.office365.com
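The null Identity comes from `$_`, which is only populated in `ForEach-Object` pipelines, not inside a `foreach` statement; using the loop variable instead (and closing the block) gives a working sketch, assuming GroupInput.csv has SourceGroup and TargetGroup columns:

```powershell
# Copy members for every SourceGroup/TargetGroup pair in the CSV.
$Groups = Import-Csv -Path .\GroupInput.csv
foreach ($Group in $Groups) {
    $List = Get-UnifiedGroupLinks -Identity $Group.SourceGroup -LinkType Members |
        Select-Object -ExpandProperty Name
    Add-UnifiedGroupLinks -Identity $Group.TargetGroup -LinkType Members -Links $List
}
```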

What's the best and most secure way to set up a web server within a DMZ? A couple of simple external sites and a SQL Server. Should I put an RODC inside the DMZ, or just open up ports to sync with AD? Or should I not have it connected to AD at all?
I ran this command. It contacts AD but does not return any results; it just waits for the next entry.

C:\Save> Get-ADUser -Filter {mail -like 'emailaddress'} -Properties * | fl workid
C:\Save> _

It looks like the above.
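One thing to check: with `-like`, a pattern without wildcards only matches the exact value, so a literal placeholder such as 'emailaddress' returns nothing, and an empty result prints no output at all. A sketch with an explicit address (the address is hypothetical; the `workid` attribute is taken from the question and assumed to exist in your schema):

```powershell
# Use -eq for an exact address, or -like with * wildcards for a pattern.
Get-ADUser -Filter "mail -eq 'jsmith@contoso.com'" -Properties workid |
    Format-List Name, workid
```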

In Azure, the page file is on the D: drive by default and set to "System managed size".

Is it better to leave it as is, or can we set it to manual?

I saw an application pool use all available memory, and the system had assigned only 2 GB of page file out of 15 GB. Why doesn't it assign more?

I just need your opinion and experience.
I have the following JSON file and I need to update the displayName to contain either QA or Production:

    "analytics": [
        {
            "kind": "Scheduled",
            "displayName": "RO-0024-DEV GPO Scheduled Task",
            "description": "test",
            "severity": "Low",
            "enabled": true,
            "query": "SecurityEvent | where EventID == \"5145\" | where AccountType == \"User\"",
            "queryFrequency": "5H",
            "queryPeriod": "6H",
            "triggerOperator": "GreaterThan",
            "triggerThreshold": 5,
            "suppressionDuration": "6H",
            "suppressionEnabled": false,
            "tactics": [],
            "PlayBookName": "Test447"
        }
    ]



I am running a PowerShell script via an Azure DevOps build agent, and depending on the tenant, I want to change the displayName that contains DEV to QA or Prod.

I have tried the following for testing; however, it fails:

$rules = Get-Content -Raw -Path .\TestJson.json | ConvertFrom-Json
$rules.analytics.displayname = "test"


I get the following error:

The property 'displayname' cannot be found on this object. Verify that the property exists and can be set.
At line:1 char:1
+ $rules.analytics.displayname = "test"
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidOperation: (:) [], RuntimeException
    + FullyQualifiedErrorId : PropertyAssignmentException

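Because `analytics` is an array, PowerShell's member enumeration can read `displayName` across the elements but cannot assign through it, hence the PropertyAssignmentException; assigning per element works. A sketch that mirrors the snippet in the question (the DEV-to-QA replacement is an example; swap in 'Prod' per tenant):

```powershell
$rules = Get-Content -Raw -Path .\TestJson.json | ConvertFrom-Json
foreach ($rule in $rules.analytics) {
    # Replace the environment tag in each rule's name.
    $rule.displayName = $rule.displayName -replace 'DEV', 'QA'
}
$rules | ConvertTo-Json -Depth 10 | Set-Content -Path .\TestJson.json
```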

Thank you in advance.

Hi all,

How would I get the batch file below to report its outcome to a text file?

echo off
net use P: /delete /yes
taskkill /F /IM explorer.exe & start explorer
net use P: \\(servername)\personnel\%username% /persistent:yes
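Appending each command's output (including errors, via 2>&1) to a log file captures the outcome; a sketch, with the log path as an example only:

```batch
@echo off
set LOG=%TEMP%\mapdrive.log
echo ==== %date% %time% ==== >> "%LOG%"
net use P: /delete /yes >> "%LOG%" 2>&1
taskkill /F /IM explorer.exe >> "%LOG%" 2>&1 & start explorer
net use P: \\(servername)\personnel\%username% /persistent:yes >> "%LOG%" 2>&1
```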
Hello all;
Windows Server 2016 Core
Installing: SQL Server 2019

Windows Installer was not running, so I started it.

I took the configuration.ini file from the SQL Server setup I did earlier.
I edited it to work with Server Core.
Here is the resulting edit:
;SQL Server 2019 Configuration File

; By specifying this parameter and accepting Microsoft R Open and Microsoft R Server terms, 
; you acknowledge that you have read and understood the terms of use. 


; Specifies a Setup work flow, like INSTALL, UNINSTALL, or UPGRADE. This is a required parameter. 


; Specify a default or named instance. MSSQLSERVER is the default instance for non-Express editions and SQLExpress for Express editions. 
; This parameter is required when installing the SQL Server Database Engine (SQL), or Analysis Services (AS). 


; The default is Windows Authentication. Use "SQL" for Mixed Mode Authentication. Remove ; to switch on mixed mode

; Specify the root installation directory for shared components.  
; This directory remains unchanged after shared components are already installed. 


; Specify the root installation directory for the WOW64 shared components.  
; This directory remains unchanged after WOW64 shared components are already installed. 


; Specify the Instance ID for the SQL 

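For reference, a minimal unattended configuration for Server Core usually carries parameter lines like these beneath the comment blocks above; the values here are examples only and must be adjusted for the target environment:

```ini
[OPTIONS]
ACTION="Install"
FEATURES=SQLENGINE
INSTANCENAME="MSSQLSERVER"
; SECURITYMODE="SQL"   ; uncomment for mixed-mode authentication
INSTALLSHAREDDIR="C:\Program Files\Microsoft SQL Server"
INSTALLSHAREDWOWDIR="C:\Program Files (x86)\Microsoft SQL Server"
INSTANCEID="MSSQLSERVER"
IACCEPTSQLSERVERLICENSETERMS="True"
QUIET="True"
```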

Hello Team,

I am using a GLPI container (Docker image). I tried to set everything up, but during the install I get an error. I even changed the image and I am still getting the same error.

I have tried to install this as a standalone as well as a farm. I have also tried to use the ARM templates with Azure. I keep having the same issue. I have searched the web and cannot find the correct solution. Has anyone had luck making it all the way through the prerequisites without seeing this issue? Maybe there was an issue with the configuration of the App Fabric?
Hello All;

I have just run into a situation where I cannot find any information on it.
Installation of SQL Server 2019 WITHOUT installing Data Quality Services.
If I uncheck it, it unchecks EVERYTHING.
So, the question is...
How do you install the SQL Server Database Engine without installing Data Quality Services?

The reason:
I am installing the "SQL Server Failover Cluster" on the 2016 Desktop Experience server.
After this, I will be installing "Add node to a SQL Server failover cluster" on the 2016 Core server.

From my experience, adding the node fails every time I have tried to install it on Core.
A person posted, but without proof, that Data Quality Services does not run on Server Core.
(This is my thread about the issue)

I am building my SQL Server clusters from scratch and want to get the Core servers running the SQL Server database engine.
Whatever I install on the main SQL cluster will filter down into the node installations.
So I need to be cautious about what gets installed, to make sure everything goes smoothly on this build.
These are the typical features that get installed on the nodes:
SQLEngine, SQLEngine\Replication, SQLEngine\FullText, SQLEngine\DQ
So please, if anyone has any information on this TA or the other TA, please let me know.
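A command-line install lets you name the features explicitly, so DQ can simply be left off the list. A heavily abbreviated sketch (a real failover-cluster install also needs network name, IP, disk, and service-account parameters):

```batch
setup.exe /Q /IACCEPTSQLSERVERLICENSETERMS /ACTION=InstallFailoverCluster ^
    /FEATURES=SQLENGINE,REPLICATION,FULLTEXT /INSTANCENAME=MSSQLSERVER
```

The same FEATURES list applies to the AddNode action later, which should keep the node installation aligned with the primary.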

I have an external domain that is advertised via our zone files. For the setup of the AAD Connect connector to sync our internal AD with the Azure cloud, it is asking us to create either TXT or MX records in our zone file for our local domain, i.e. the ABC.local domain our users exist on.

Do we need to create this under my external DNS (the ABC.com.fj domain), or do I create a separate domain with a separate DNS record for my ABC.local domain?
In order to issue certificates to local domain systems, a Systems Admin stood up an MS CA instance on a Server 2012 R2 server, which was also one of two Domain Controllers for the domain. The initial CA was not correct, so he rebuilt it on the SAME machine using a different CA name. When things didn't go right again, he rebuilt a third instance on the SAME computer, again giving it a new name. There were three root authorities (CA, CA-1, and CA-2), one of which actually worked most of the time.

The first time I noticed there was a problem was when I was tasked with configuring the domain for Multi-Factor Authentication and auto-enrollment of computers and users. I knew nothing about CAs, so I took to reading and discovered that the first SA had installed the root authority in the wrong place. After reading MS articles and papers about moving from one computer to another, I stood up a Server 2016 instance and restored CA-2 onto the new computer. I removed the MS CA role from the DC and went through the steps of decommissioning the old CA server, removing the entries in Sites and Services. When I run certutil, I get only one entry in the dump, and it is the correct server. Web services have been operating fine, and certificates are validated at the CA. As far as I knew, I had successfully relocated the Cert Authority to a new server off of the DC.

I used the CA to issue a cert to our Exchange Server and everything was fine for almost a year. Recently, after …
We sync our on-premises AD users to Azure/O365, and I am having problems updating the UPN, proxyAddresses, and msRTCSIP-PrimaryUserAddress of a user (userA).

The sync conflict error states the value already exists as another user's SipProxyAddress in Azure (userB). UserB is also an on-prem AD user which doesn't have that value in on-prem AD, but I am guessing the removal hasn't replicated to clear it from Azure either.



UPN - userA@domain.com
Proxyaddresses: smtp:userA@domain.com, sip:userA@domain.com
msRTCSIP-PrimaryUserAddress: sip:userA@domain.com

UPN - userAAA@domain.com
Proxyaddresses: smtp:userAAA@domain.com,


UPN - userB@domain.com
Proxyaddresses: smtp:userB@domain.com,


Is there a way to selectively sync these users separately (or at least filter out the conflicting attributes), so userB syncs first and hopefully clears its SipProxyAddress from Azure, and then sync userA in a separate sync cycle? Or is there another way to fix this?

Thank you!
We are currently on 2008 R2 domain controllers and are looking at introducing a 2016 domain controller to co-exist with the 2008 R2 domain controllers. We would like to keep the functional level at 2008 R2 for now with the 2016 DC. Can this be done, and will Microsoft support the 2008 R2 functional level on a 2016 domain controller? In addition, we have a handful of 2003 servers and would like to know if their authentication is still supported.

Based on our research we found that "operating systems like Windows XP, Server 2003 and 2008 (not SP2) are not supported" from the link below. Appreciate it if this can be validated.


I need to use ROBOCOPY to copy all folders from \\SOURCESERVER\d$ (please see screenshot) to a shared drive on a new server, \\NEWSERVER\CENTRAL.

Can I use something like the below?

ROBOCOPY "\\SOURCESERVER\d$\BE Disaster Recovery Server Files" "\\NEWSERVER\CENTRAL" /MIR /SEC /W:0 /R:0 /IS /LOG:c:\copy.log

I want all folders to appear in \\NEWSERVER\CENTRAL with their original security permission settings. Please check whether the parameters I use are correct.

Hi Experts,

I have some questions regarding DC upgrades.

The forest and domain functional level is 2008 R2.

I have to install many Windows Server 2019 DCs.

What is the minimum level I need?
How do I raise the level in steps?
Which Exchange version is supported?

