Cloud Computing





Cloud computing, also known as on-demand computing, is a kind of Internet-based computing where shared resources and information are provided to computers and other devices. It is a model for enabling ubiquitous, on-demand access to a shared pool of configurable computing resources. Cloud computing and storage solutions provide users and enterprises with various capabilities to store and process their data in third-party data centers, relying on sharing of resources to achieve coherence and economies of scale.

Is it possible to create a template without vCenter? I don't see the option when I log in directly to the ESXi host. My vCenter is broken and has been removed, and I want to see if there is a way to create a template from a VM, but I do not see the option.
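
Would exporting the VM to OVF/OVA be the closest alternative on a standalone host? For reference, a minimal sketch using VMware's ovftool (the host name, VM name, and output path are placeholders, and ovftool is assumed to be installed):

ovftool "vi://root@esxi-host/my-vm" /tmp/my-vm-template.ova

The resulting OVA could then be redeployed on the host as a template substitute.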

Verbose info needed: how can I get more verbose information with CloudFormation templates in AWS?

For example, below: how can I get more information on the circled item? How can I see a verbose log for this?
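
For reference, one way to pull more detail out of a stack is its event stream, which records a status reason for each resource. A minimal AWS CLI sketch (the stack name is a placeholder):

aws cloudformation describe-stack-events --stack-name my-stack \
  --query 'StackEvents[].[LogicalResourceId,ResourceStatus,ResourceStatusReason]' \
  --output table

Adding the global --debug flag to any aws command also prints verbose request/response logging.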

How can I place a key on an AWS RHEL instance with "user data"?

Is it possible to create a "here document" script and have it execute in an AWS instance's "user data"?

Like this:

User data with a bash script in it:

#!/usr/bin/env bash
mkdir -p /home/ec2-user/.ssh
cat <<EOF > /home/ec2-user/.ssh/docker-master-key.pem

I want to be able to pass a key to an instance on boot. I have used just the above and it does not seem to do anything at all, and I cannot find any errors. How can I find the logging from this user data?
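
For reference: on RHEL and Amazon Linux AMIs, user data is executed by cloud-init at first boot, so its output normally lands in /var/log/cloud-init-output.log (and /var/log/cloud-init.log). A minimal sketch of a completed version of the script above, assuming the heredoc only needs to be closed and permissions set; the key body is a placeholder:

#!/usr/bin/env bash
# Runs once as root at first boot; output is captured in /var/log/cloud-init-output.log
mkdir -p /home/ec2-user/.ssh
cat <<'EOF' > /home/ec2-user/.ssh/docker-master-key.pem
-----BEGIN RSA PRIVATE KEY-----
(placeholder: real key material goes here)
-----END RSA PRIVATE KEY-----
EOF
chmod 600 /home/ec2-user/.ssh/docker-master-key.pem
chown ec2-user:ec2-user /home/ec2-user/.ssh/docker-master-key.pem

Quoting the heredoc delimiter ('EOF') keeps the shell from expanding any $ characters inside the key material.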

AWS SSM "Run Command" to pull down a file from an S3 bucket and place it into a specific directory

I am attempting to run a shell script that copies a file (a .pem key) from an S3 bucket to an AWS EC2 instance.

I have the knowledge to get to Run Command and run shell scripts, no problem, but I have no idea how to copy a file from S3 and place it into the ~/.ssh/ directory.

The eventual goal is to have an application (Ansible) connect to each instance.

I think I will first have to copy the PEM key, and that will then allow me to copy the key to each instance.

How can this be done? I am very new to this part of it.

Thanks!
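
For reference, a minimal sketch of the kind of script one might paste into Run Command (the AWS-RunShellScript document), assuming the instance's IAM role can read the bucket; the bucket and key names are placeholders:

#!/usr/bin/env bash
# Run Command executes as root, so use the explicit home directory rather than ~
aws s3 cp s3://my-bucket/keys/ansible-key.pem /home/ec2-user/.ssh/ansible-key.pem
chmod 600 /home/ec2-user/.ssh/ansible-key.pem
chown ec2-user:ec2-user /home/ec2-user/.ssh/ansible-key.pem
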
How can I SSH to a Linux box and copy the key to the destination in one command?

Someone showed me this once; it was "l", or "ssh -L", or something like that.
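
For reference, the usual one-command options are scp for copying a file, or ssh-copy-id for installing a public key; the host names and paths below are placeholders:

# Copy the key file to the remote host's ~/.ssh in one command
scp ./docker-master-key.pem ec2-user@remote-host:~/.ssh/

# Or install a public key on the remote host for passwordless logins
ssh-copy-id -i ~/.ssh/id_rsa.pub ec2-user@remote-host

(ssh -L sets up port forwarding, which may be what was half-remembered.)
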
A Mac OS X user was running an older version of Google Drive on her Mac to connect to a corporate Google Drive structure where our organization stores files within the Google cloud.

A week ago she received a prompt to upgrade to Google File Stream and proceeded to perform the upgrade.

She then uninstalled Google Drive.

Now her computer has been having persistent issues updating existing files and downloading new files.

When we open Google File Stream it always says that it is running and syncing. Also her computer constantly maintains a connection to the internet.

She is able to fix the issue by rebooting her Mac and then it will sync the latest changes and download the newest files. However, she has to reboot her computer several times a day.

What can be done to fix this issue?

I am using Cloud Control 13c, configured and running for my databases and hosts.

How can I configure Cloud Control 13c to send notifications for things like low disk space, high memory usage, low tablespace, and so on?

Where can I look to set those configurations and make them work?


  Joe Echavarria.
I am following Heroku's getting-started tutorial and have reached the "Push local changes" step.
However, it wasn't successful.
I don't know where I went wrong or what I should do to make the cowsay example work.
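
For reference, that step boils down to committing and pushing to the heroku git remote; a minimal sketch, assuming the app was created with heroku create (the commit message is arbitrary):

git add .
git commit -m "add cowsay"
git push heroku master   # or 'main', depending on the branch name
heroku logs --tail       # usually shows why a push or dyno failed
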
I have a scenario in a hybrid environment: Exchange 2010 & O365.

An O365 user is trying to send mail to a dynamic distribution group which is on premises.

Error: 550 5.7.1 RESOLVER.RST.AuthRequired

"Require that all senders are authenticated" is checked, and I don't want to uncheck this for security reasons. Can you please suggest a workaround for this one?

We are not creating any objects in the cloud, so creating a contact with the same dynamic group name is ruled out.
Hi All,
I am using Grafana on Elasticsearch to show % pass/fail in weekly reports. I can get pass/fail from the pie chart, but not in a stacked bar graph.
Is there any plugin I can use to get the pass/fail percentage in a bar graph?


How do I seamlessly move workloads, such as my VMware VMs, into AWS EC2 instances or AWS containers, and seamlessly move them back into VMware in my on-premises data centre, using any tool out there? I will only be using AWS for now.
I had been using the Azure Backup Agent to back up files from an Azure VM. Backups were running to a Recovery Services vault.
I have since moved to another backup solution for this server (Veeam Backup & Replication).

I don't want to continue paying for the Azure Recovery Services vault. Is there any way I can export the backup and retention data before deleting the vault?
We are planning to set up an AD DC server in AWS. We have two office sites, each with a medium-scale firewall. I need your assistance with how to plan for it. We have around 40 employees across both locations. We also need to apply policy for users who work from home. Kindly assist with this.
Hello Experts,

Can someone please let me know the major benefits of using HashiCorp's Terraform instead of Azure ARM?

I know that Terraform is provider agnostic (it can be used with any cloud provider), and I'm aware that it's more readable than Azure's JSON templates.

However, because I have no desire to train on AWS or on Google's Cloud Platform, I would like to know how I would benefit from learning and adopting Terraform if I plan on staying an Azure cloud solutions engineer.


Hello Experts,

I have run an HQL script called samplehive.hql (see attached). However, the script fails with the following error:

FAILED: ParseException line 1:2 cannot recognize input near 'D' 'R' 'O'
18/01/17 20:46:46 [main]: ERROR ql.Driver: FAILED: ParseException line 1:2 cannot recognize input near 'D' 'R' 'O'
org.apache.hadoop.hive.ql.parse.ParseException: line 1:2 cannot recognize input near 'D' 'R' 'O'

I'm very new to Hadoop Hive. Can someone take a look at the script and let me know where I'm going wrong?
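
For reference, a ParseException that breaks a keyword into single letters ('D' 'R' 'O' at line 1, presumably DROP) often points to a file-encoding problem rather than bad HQL, e.g. the script was saved as UTF-16 or with a byte-order mark, so the lexer sees stray bytes between the letters. A quick way to check and convert (a sketch; the output file name is arbitrary):

file samplehive.hql                   # reports the detected encoding
head -c 16 samplehive.hql | od -c     # shows any BOM or NUL bytes at the start
iconv -f UTF-16 -t UTF-8 samplehive.hql -o samplehive-utf8.hql   # or redirect stdout if your iconv lacks -o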

As I have mentioned in one of my previous questions, as a business we are now procuring more and more cloud services.
I’m trying to examine our procedures and governance to make sure that we procure cloud solutions that are fit for purpose and secure.
One of the controls I have thought of is a central repository of our cloud providers. We already have numerous different suppliers, and I don't think as a business we have one go-to document that shows the provider of the cloud solution hosting our data, an emergency contact, the location of the data, etc. I have been reading through best-practice guides and frameworks, but I can't see anything suggesting that this is something that should be implemented.
It seems like a good idea to me, but can anyone point me in the direction of any best practice, frameworks, guides, etc. that support this, to take to management as a bit of backup?
Is it something that is done in your own businesses?
Are there any real benefits to this and are there any risks to not implementing such a control?

Unless of course you feel this is not a worthwhile control to have in place, in which case please feel free to say so.
Is anyone using SkySync Enterprise connected straight through to StorageCraft's cloud? StorageCraft hasn't completed their Box sync yet. I'm trying to determine if it's possible to sync between Box and the StorageCraft data center using SkySync.
I’m about to start some work looking at how our business procures cloud services.
We are a large business with many different departments and an in-house IT team.

We already have some of our systems hosted via SaaS and these have followed correct procedures in terms of ensuring data is appropriately secured etc.  
My concern is that some departments might be going off and procuring cloud services with no input or support from IT. This obviously poses several high-risk issues, especially when you consider that some of the data may be private, confidential, or commercially sensitive, and as such needs to be carefully controlled.

I have two questions:
1) How do other businesses manage the procurement of cloud services to ensure that appropriate security and governance issues have been considered, and how can you control this so that individual departments can't just go and do this themselves with no IT/business input?
2) How can we determine if we have any data out there in the cloud that we are centrally unaware of and that has had no corporate input? I'm thinking of analysing our finance system to determine if any payments have been made to cloud providers (this might not be easy, as not all providers have obvious names), reviewing our contracts registers to identify any cloud providers, and reviewing corporate credit cards to see if anything similar has been purchased on them.

The aim of this is to ensure there are suitable frameworks/procedures/policies…
What are the important factors to consider while migrating to the cloud, and what best practices are needed to protect your investment?

I have several sites with on-premises VMware setups running numerous applications. I want to move everything out to the cloud. The challenge I have is which fit is best. I'm leaning toward a private data centre, as I can transition swiftly and all resources are dedicated and will be resilient.
IaaS has been mentioned, but I'm not sure it's economical, as I have over 120 VMs.

Interested to hear from other people who were in this scenario and what they chose. Thanks
I am looking for a cloud provider for one of my clients:

30 users
1 server
132 GB of data (currently)

My request is for those who use and resell cloud products.

Looking for reliability and price.

Thanks, Cjoego

I am new to Hadoop. I have a question regarding YARN memory allocation. If we have 16 GB of memory in a cluster, we can have at least three 4 GB containers and keep 4 GB for other uses. If a job needs 10 GB of RAM, would it use three containers, or would it use one container and start using the rest of the RAM?

terraform v0.10.7
google cloud platform
various .tf files for creating backend, variables etc


I am able to create multiple VM instances and also multiple additional disks (boot_disk is working fine on each instance), but I want to be able to attach those additional disks to each VM accordingly, without having to have individual resource definitions for each VM (if that makes sense!).

The code I have so far is below; it works OK for building multiple compute instances and also multiple additional disks. Note: I have commented out the attached_disk block, which errors at the moment.

*** START ***

variable "node_count" {
  default = "3"
}

resource "google_compute_disk" "test-node-1-index-disk-" {
    count   = "${var.node_count}"
    name    = "test-node-1-index-disk-${count.index}-data"
    type    = "pd-standard"
    zone    = "${}"
    size    = "5"
}

resource "google_compute_instance" "test-node-" {
    count = "${var.node_count}"
    name = "test-node-${count.index}"
    machine_type = "${var.machine_type}"
    zone = "${}"

    boot_disk {
      initialize_params {
        image = "${var.image}"
      }
    }

#    attached_disk {
#        source      = "${google_compute_disk.test-node-1-index-disk-(count.index)}"
#        device_name = "${google_compute_disk.test-node-1-index-disk-(count.index)}"
#   }

    network_interface {
      network = "default"
      access_config {
        // Ephemeral IP
      }
    }
}
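
# For what it's worth: the parentheses in the commented-out attached_disk above are not
# valid Terraform interpolation syntax. Under 0.10, a counted resource is usually indexed
# with element() over a splat list; a sketch against the resource names above (an
# assumption, not a verified configuration):
#
#    attached_disk {
#        source      = "${element(google_compute_disk.test-node-1-index-disk-.*.self_link, count.index)}"
#        device_name = "${element(google_compute_disk.test-node-1-index-disk-.*.name, count.index)}"
#    }
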
I am currently setting up a training room where I want everything to be automated. I have door-style windows on a slanted roof that currently need to be opened manually. The problem is that the windows are very high, and it is not practical to try to open them manually.

They are side hung escape windows with wooden frames and dimensions 660x1180mm.

I want to convert the windows so they can open automatically, preferably via a remote or voice commands through my SmartThings hub or Amazon Alexa. Would this be possible?

Would you be able to guide me to the type of window opener/actuator that would best suit my windows, and tell me what else I would need in order to use my SmartThings hub or IFTTT event triggering?

Any solutions would be much appreciated.

This could very well be, and/or end up in, the Gigs section, but I need to collect some more information and education first.

I have an Access database that, put simply, tracks real-estate comparable sales (data/pictures), ties them together with a subject property, and then outputs a report.

I wish to recreate this database so that it can be accessed by multiple people in two, possibly more, locations. I've been reading up on Google Cloud and various flavors of SQL, and I think it can be done, but I am more willing to pay someone to do this rather than learn something new. I do, however, need to understand why solution A is better than solution B.
