AWS

Amazon Web Services (AWS) is a collection of remote computing services, also called web services, that together make up a cloud-computing platform operated from 11 geographical regions across the world. The most central and well-known of these services are Amazon Elastic Compute Cloud ("EC2") and Amazon Simple Storage Service ("S3"). Other services include Elastic MapReduce (EMR), Route 53 (a highly available and scalable Domain Name System (DNS) web service), Virtual Private Cloud (VPC), and a range of storage, database, deployment, and application services.


Dear Experts,

Can we configure AWS CloudWatch to perform the following monitoring roles?

1. Detecting disconnection between our instances and the customer firewall, since we are connected through a VPC.
2. Detecting a server instance shutting down or restarting unexpectedly.
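Both are plausible CloudWatch targets: the second item maps naturally onto the EC2 StatusCheckFailed_System metric (plus instance state-change events), and for a managed site-to-site VPN the tunnel state is exposed as the TunnelState metric in the AWS/VPN namespace and can be alarmed the same way. A hedged sketch that only builds the parameters for put_metric_alarm (the alarm name and SNS topic ARN are placeholders, not values from the question):

```python
def status_check_alarm_params(instance_id, sns_topic_arn):
    """Build keyword arguments for CloudWatch put_metric_alarm.

    Fires when the EC2 system status check fails for 2 consecutive
    minutes -- a reasonable proxy for an unexpected stop or host
    failure.  Pass the resulting dict to
    boto3.client('cloudwatch').put_metric_alarm(**params).
    """
    return {
        "AlarmName": f"status-check-failed-{instance_id}",  # hypothetical name
        "Namespace": "AWS/EC2",
        "MetricName": "StatusCheckFailed_System",
        "Dimensions": [{"Name": "InstanceId", "Value": instance_id}],
        "Statistic": "Maximum",
        "Period": 60,
        "EvaluationPeriods": 2,
        "Threshold": 1.0,
        "ComparisonOperator": "GreaterThanOrEqualToThreshold",
        "AlarmActions": [sns_topic_arn],
    }

params = status_check_alarm_params(
    "i-0123456789abcdef0",                      # placeholder instance ID
    "arn:aws:sns:us-east-1:111122223333:ops",   # placeholder SNS topic
)
```

The same dict-building pattern works for the VPN case by swapping in Namespace "AWS/VPN" and MetricName "TunnelState".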

Dear Experts,

I have set up a snapshot of the AWS instance every 12 hours.

Is there any way to move the snapshots to an S3 bucket using the GUI only?

Can incremental backups be done for an AWS instance other than snapshots?

Is there any way to do a weekly backup of the same instance?
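For context on the first two questions: EBS snapshots are already incremental, and they are stored in S3 behind the scenes (just not in a bucket you can browse), so there is nothing to move by hand; scheduling is typically delegated to Data Lifecycle Manager. A hedged sketch of the policy body you would pass to dlm.create_lifecycle_policy; the tag key/value and schedule name are assumptions:

```python
def dlm_policy_details(tag_key="Backup", tag_value="true"):
    """PolicyDetails for Data Lifecycle Manager
    (boto3.client('dlm').create_lifecycle_policy):
    snapshot tagged volumes every 12 hours, keep the last 14.
    Tag key/value are hypothetical -- tag your volumes to match."""
    return {
        "ResourceTypes": ["VOLUME"],
        "TargetTags": [{"Key": tag_key, "Value": tag_value}],
        "Schedules": [{
            "Name": "twice-daily",  # hypothetical schedule name
            "CreateRule": {"Interval": 12, "IntervalUnit": "HOURS",
                           "Times": ["09:00"]},
            "RetainRule": {"Count": 14},
            "CopyTags": True,
        }],
    }
```

A weekly schedule is just a second entry in "Schedules" with a different CreateRule.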
I want Ansible to run through a list of EC2 instances and perform an AWS CLI command. I am able to do this one server at a time, but cannot get it to work with a list.

---
- hosts: "tag_tool_ansible"
  connection: "local"
  gather_facts: false
  vars:
    ec2_instance_type: "t2.small"
    ec2_id:
       "i-0035105226cbb39e0"
       
  tasks:
  - name: some command
    command: aws ec2 modify-instance-attribute --instance-id {{ec2_id}} --instance-type {{ec2_instance_type}}



However, as soon as I add a second server, it fails:

---
- hosts: "tag_tool_ansible"
  connection: "local"
  gather_facts: false
  vars:
    ec2_instance_type: "t2.small"
    ec2_id:
       "i-0035105226cbb39e0"
       "i-06608e259685b0db7"
  tasks:
  - name: some command
    command: aws ec2 modify-instance-attribute --instance-id {{ec2_id}} --instance-type {{ec2_instance_type}}


How can I reference a var that is a list of servers without having to configure Ansible roles?
Note: the variables come from the AWS dynamic inventory, as I run the playbook like so:

ansible-playbook -i ../ec2.py t2micro_to_small_list.yml
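For what it's worth, the usual pattern is to make the variable a real YAML list (each item prefixed with "-") and iterate with loop; a sketch (untested here, reusing the same command and instance IDs as the question):

```yaml
---
- hosts: "tag_tool_ansible"
  connection: "local"
  gather_facts: false
  vars:
    ec2_instance_type: "t2.small"
    ec2_ids:
      - "i-0035105226cbb39e0"
      - "i-06608e259685b0db7"
  tasks:
    - name: resize each instance
      command: >
        aws ec2 modify-instance-attribute
        --instance-id {{ item }}
        --instance-type {{ ec2_instance_type }}
      loop: "{{ ec2_ids }}"
```

The original vars block put two quoted strings on consecutive lines, which YAML concatenates into a single scalar rather than a list; the dash-prefixed form above is what makes {{ item }} iterate.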
Regarding a security group rule issue in AWS:

We created a security group and associated it with an EC2 instance. In one of its rules we set the source to another security group (for example, SG-12345), but access does not work. If we instead set the source of that rule to a particular IP address or subnet, it works. We do not know why the rule fails when the source is another security group.

Please help us identify and fix the issue.

Thank You
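One detail that often explains this exact symptom: when a rule's source is another security group, it matches only traffic arriving from the private IP addresses of instances associated with that group, within the VPC (or a peered VPC). Traffic that reaches the instance via a public or Elastic IP never matches the group reference, while a CIDR rule does. For reference, a hedged sketch of such a rule's parameters for ec2.authorize_security_group_ingress; the IDs and port are placeholders:

```python
def sg_ingress_params(target_sg_id, source_sg_id, port=443):
    """Keyword arguments for ec2.authorize_security_group_ingress that
    allow TCP <port> from members of source_sg_id into target_sg_id.
    Reminder: this matches only traffic from the source instances'
    PRIVATE IPs inside the VPC."""
    return {
        "GroupId": target_sg_id,
        "IpPermissions": [{
            "IpProtocol": "tcp",
            "FromPort": port,
            "ToPort": port,
            "UserIdGroupPairs": [{"GroupId": source_sg_id}],
        }],
    }

params = sg_ingress_params("sg-target1234", "sg-12345")  # placeholder IDs
```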
Is there an S3 API that can help me measure the time it takes for replication across two data centers?

I am trying to read the replication metadata (REPLICATED, PENDING, etc.) to measure the time, but the status always comes back null.

Does this work only with CRR?
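For context: head_object surfaces the object's x-amz-replication-status, but only for objects that actually match a replication rule (CRR or, more recently, Same-Region Replication); for keys outside any rule the field is simply absent, which would read as null. For latency specifically, S3 replication metrics (when enabled) publish a ReplicationLatency CloudWatch metric, which is usually a better fit than polling per object. A minimal sketch of reading the status from a boto3-style response; the response dict here is illustrative, not real output:

```python
def replication_status(head_object_response):
    """Extract the replication status from a boto3 head_object response.
    Present only when the object matches a replication rule:
    PENDING / COMPLETED / FAILED on the source, REPLICA on the
    destination.  Returns None otherwise -- matching the
    'always null' symptom when no rule applies to the key."""
    return head_object_response.get("ReplicationStatus")

# Illustrative (trimmed) response shape for a source object mid-replication:
resp = {"ContentLength": 1024, "ReplicationStatus": "PENDING"}
```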
I have a Windows Server 2008 R2 server that is configured as an RDS server.  It was a physical box on site and we recently migrated it to AWS.

Everything seems to work fine since the migration to AWS aside from printer redirection.  

I have confirmed that printing is configured properly in the RDP configuration, the spooler is running, and the server has the correct drivers installed for each printer. Redirection is configured in the RDP client, and the server is joined to the domain with a secure connection to AD. Easy Print is enabled, but I have also tried disabling it, to no avail.

I've read about other people having issues with printer redirection on AWS.  Any help or suggestions would be greatly appreciated.
Hi Experts,
Could you please point me to resources for exploring AWS architectures? I'm interested in topics around DevOps, AWS, Terraform, Mesos, containers, Kubernetes, etc.

1. Basically, I would like to understand how others have implemented solutions using various AWS services, rather than starting from scratch:
https://aws.amazon.com/this-is-my-architecture/

2. Also, resources where we can explore and get hands-on experience, like:
https://qwiklabs.com/ 

Thanks in advance
AWS - two questions pertaining to AWS, or hosted services. First, I am trying to get an idea of whether AWS is a cost-effective hosting replacement for a small business that has anywhere from 1-6 in-house servers. I took a peek at some of the sizing tools for AWS, but I am not clear on how I would assess throughput if all daily functions were moved to AWS. Would you also move all infrastructure servers, so that clients authenticate to DCs in the cloud? I am not sold on this at all, but it makes absolute sense to know the pros and cons. From a control, cost, and security perspective it probably still makes more sense to keep in-house servers for a small business, but I need to confirm.


Secondly - what is the best way to get familiar with AWS for small Windows environments? I am trying to see if it makes sense from a business-continuity standpoint. For example, we currently use Veeam to replicate all VMs to another remote host, so if we lose site A we can spin everything up within minutes. Any thoughts on the best way to get a good understanding of AWS?

Thanks guys!!!
Licompguy
Hi Experts

Could you give me any direction on how to synchronize an AWS NoSQL MongoDB table with a MS SQL Server table located on a private cloud?

The tables must have real-time synchronization.

My first idea is to do that by using a PHP web service.

Thanks in advance
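If real time is a hard requirement, one hedged approach (an assumption, not the only route) is to tail MongoDB change streams (pymongo's collection.watch(), available since MongoDB 3.6) and replay each event into SQL Server via an ODBC driver. The event-to-SQL mapping step might look like the sketch below; the table and column names are hypothetical:

```python
def change_to_sql(change):
    """Map a MongoDB change-stream event (a plain dict here) to a
    parameterized SQL Server upsert/delete statement plus parameters.
    dbo.Customers and its columns are hypothetical names."""
    op = change["operationType"]
    key = change["documentKey"]["_id"]
    if op in ("insert", "update", "replace"):
        doc = change["fullDocument"]
        sql = ("MERGE dbo.Customers AS t USING (SELECT ? AS id) AS s "
               "ON t.id = s.id "
               "WHEN MATCHED THEN UPDATE SET name = ?, email = ? "
               "WHEN NOT MATCHED THEN INSERT (id, name, email) "
               "VALUES (?, ?, ?);")
        params = (key, doc["name"], doc["email"],
                  key, doc["name"], doc["email"])
        return sql, params
    if op == "delete":
        return "DELETE FROM dbo.Customers WHERE id = ?;", (key,)
    return None  # ignore other operation types in this sketch
```

The surrounding loop would be roughly: for change in collection.watch(): cursor.execute(*change_to_sql(change)) - with retry/resume-token handling added for production use.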
Hi,

I am looking into purchasing AWS Dedicated Hosts for licensing reasons. I am assuming that when the hardware running a Dedicated Host fails, all instances hosted on it also fail. Does anyone know what happens in this event?

Thanks,
Adrian

We have a Windows 7 workstation that runs indexing software for a document management system.

We are in the process of moving our clients whole infrastructure to the cloud.

The migration software we are using will not move a machine that is formatted with FAT32 to the cloud.

I've P2V'd the machine and have tried using the AWS Import feature, but the upload consistently fails when uploading to S3.

I've thought about breaking the VHD into multiple files and attempting to upload each file individually.

Has anyone encountered this issue before or have any alternate suggestions?

Your input would be greatly appreciated.
Hello,

I'm looking for a cost-effective, automated solution for AWS/Glacier offsite backups. I understand the fundamental differences between Glacier and regular storage in AWS (at least I *think* I do).

Here are the items that need to be backed up regularly:
(2) Local machines (Win7/Win10)
(1) Plex Media server
(1) WD NAS (stores backup of Plex Media, plus non-automated backups of local machines)

Local machines are currently backed up manually, at irregular intervals, with Beyond Compare to the NAS.
Media files from Plex are backed up to the NAS automatically.

Manually, once a year, we use Beyond Compare to run a full comparison of the media files on AWS and on the NAS.
Is there an easier, automated way to zip these media files (so there are fewer Put/Get/List requests) and upload only the updated/changed zip files to S3? Automating individual file scanning/comparison on a routine basis gets "expensive". I can't control costs related to the amount of storage needed - that's a given - but when I run Beyond Compare against the S3 storage, the number of requests needed to compare each file for changes gets extreme: upwards of $50/month, which maybe isn't expensive to some, but for a home setup that's a little much, IMHO.

The local-machine Beyond Compare backups to the NAS are just file replication - copy/overwrite - so it's not even a great backup solution, but at least if a drive on a local machine dies, mission-critical files exist somewhere else... not the best by any means, but …
Description of issue:
  • We are attempting to migrate data and then continue to replicate from our local MySQL database to Aurora MySQL. Aurora reports that we have successfully completed the migration; however, when we check the data, not all the records have migrated past a certain point. After this point no more of the original records are copied, although the database continues to replicate changes made after the initial full load, leaving a gap of missing records.
  • This problem only occurs with tables with BLOB fields. Even though tables with BLOB fields do not fully load all records, the task reports 100% completion and no table errors.
  • As stated, replication continues as normal afterwards and new rows are inserted.
  • Approximately 350,000 records are missing.
  • Full LOB mode is enabled with the default chunk size of 64.
  • We have set max_allowed_packet and wait_timeout on the target to their maximum values, 1073741824 and 31536000 respectively.
  • We also reduced the commit rate during full load to 1000; we found that the default values would cause the full load to fail and continually restart.

Source: On site MySQL 5.7.20 Community over VPN
Target: Aurora MySQL 5.7.12 db.r4.large
Replication instance: dms.r4.large, version 3.1.2, 80GB storage
Migration type: Full Load, Ongoing Replication
Network transfer: Over VPN to the VPC
Size of transfer: ~700 tables totalling nearly 50GB
 
Example:
I was wondering whether Splunk can work with AWS Elasticsearch?
Hello!

I'm new to building an Amazon Alexa skill and the course I'm taking seems to be a bit outdated. In the course, it says to use event.request.intent to get the intent of the user. However, I'm getting an undefined error. Should I be using something else to get the intent?

exports.handler = function(event, context) {
    console.log(event.request.intent);   
}



Produces this error
TypeError: Cannot read property 'intent' of undefined
I am trying to deny access to AWS services for users outside an accepted IP address range. I am trying to use a CloudFormation template in YAML to create a policy. I am a bit new to YAML, so any advice/help would be appreciated.
deny-ipaddress.yml
So we currently have a Cisco ASA 5512-X, running v9.2.

We currently use a split-tunnel VPN; however, we want to move away from split tunneling, as it causes routing issues for us to AWS.

Is there a good way for me to build out another VPN interface and apply new profiles/rules to test?
AWS is not picking up the laptop camera. The camera definitely works in the Windows 10 environment. My understanding is that webcams do not get picked up by the AWS client; I am not sure whether that is also true for a laptop's built-in camera.
Dear Experts,

I have a brief idea of Amazon Web Services.

I know that you create an instance: a virtual server/PC in the cloud.

The wizard is there to guide and you have to generate and download the key pair in order to access it.

I also know that an S3 bucket is used to store the backup of an Amazon EC2 instance, but where can I get information on how to do the backup to the S3 bucket using the GUI instead of the CLI?

15 or 20 times a day we see an error like the one below on our Lambda instance. One thing that jumps out is that the source address is a link-local address instead of a normal private (or public) address. Is it normal for Lambdas to use a link-local address as the source for their destinations?

There appear to be no network errors over DX, no bandwidth problems, and thousands of other connections per hour are successful. It's this small subset we're trying to figure out.

read tcp 169.254.176.149:58672->10.170.10.15:443: read: connection reset by peer
0
Our Active Directory domain is, say, contoso.com, and our corporate URL is the same: https://contoso.com. The URL is publicly hosted on AWS and has an elastic FQDN. In order to make the URL accessible on the internal network, the IT team tried to create a CNAME record pointing to the public FQDN, but the DNS service will not let us create a CNAME with a blank name, stating: "A new record cannot be created. An alias (CNAME) record cannot be added to this DNS name. The DNS name contains records that are incompatible with the CNAME record."

For now, we have created a CNAME for www, and with that we can open the URL as www.contoso.com, but we want to open it without www internally.
ASP.NET Core web client (Razor) login using an AWS Cognito user pool and the AWS .NET SDK.

How do I use an AWS Cognito user pool to authenticate and authorize an ASP.NET Core web client and an ASP.NET Core Web API?

I have already created an AWS Cognito user pool and app client, and followed this article from AWS:

https://aws.amazon.com/blogs/mobile/use-csharp-to-register-and-authenticate-with-amazon-cognito-user-pools/

I have reached this point:

var cognito = new AmazonCognitoIdentityProviderClient(_region);
//var cognito = new AmazonCognitoIdentityProviderClient(credentials);

var request = new AdminInitiateAuthRequest
{
    UserPoolId = _aWSConfig.PoolID,
    ClientId = _clientId,
    AuthFlow = AuthFlowType.ADMIN_NO_SRP_AUTH
};

request.AuthParameters.Add("USERNAME", "test@test.com");
request.AuthParameters.Add("PASSWORD", "P@ssword12");

var response = await cognito.AdminInitiateAuthAsync(request);

return strToken = response.AuthenticationResult.AccessToken;

1. What are the next steps so that the ASP.NET Core web client is aware the user is logged in?

For example, so that the following are set:

User.Identity.IsAuthenticated

User.Identity.Name

User.Claims

2. What other details from the token need to be stored, where, and how in the ASP.NET client so that they can be used to send in HttpClient requests …
I recently configured a local certificate authority server. Since then, our website hosted on AWS with a valid certificate gives this error message:

Your connection is not private
Attackers might be trying to steal your information from <mydomainname>.org (for example, passwords, messages, or credit cards). Learn more
NET::ERR_CERT_COMMON_NAME_INVALID

Can you please let me know what I need to do to make sure that accessing the website from the internal network does not use, or bypasses, the local certificate?

Thank you
Hi,

I am working through the process of automating my backups in AWS. I have a process to take VSS snapshots of my volumes, and I also have a separate script in AWS Lambda which can automatically take an AMI. What I'm now looking to do is combine these into one single function. The Lambda script does take a snapshot; however, I'm not certain that it's a VSS snapshot. I've googled the issue, but all the articles I've come across seem to describe the two processes as separate entities.

This is the Python script I'm using in Lambda to take an AMI:

# Automated AMI Backups
#
# @author Robert Kozora <bobby@kozora.me>
#
# This script will search for all instances having a tag with "Backup" or "backup"
# on it. As soon as we have the instances list, we loop through each instance
# and create an AMI of it. Also, it will look for a "Retention" tag key which
# will be used as a retention policy number in days. If there is no tag with
# that name, it will use a 7 days default value for each AMI.
#
# After creating the AMI it creates a "DeleteOn" tag on the AMI indicating when
# it will be deleted using the Retention value and another Lambda function

import boto3
import collections
import datetime
import sys
import pprint

ec = boto3.client('ec2')
#image = ec.Image('id')

def lambda_handler(event, context):
   
    reservations = ec.describe_instances(
        Filters=[
            {'Name': 'tag-key', 'Values': ['backup', 'Backup']},
        ]…
How do I find the SQS header value in an AWS message? We are using a Java POJO to access the message and set the message value. How can we find the message header value?

Can somebody help?

Thank you
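If the attribute map is coming back empty, one common cause is that SQS message attributes must be requested explicitly on the receive call (in the Java SDK, via the receive request's message attribute names); otherwise the map is simply absent. A sketch of the same idea in Python/boto3 terms, operating on a received-message dict; the queue URL and attribute name are placeholders:

```python
def get_message_attribute(message, name):
    """Pull a single SQS message attribute out of a received message.
    Note: the receive call must explicitly ask for attributes, e.g.
    sqs.receive_message(QueueUrl=url, MessageAttributeNames=['All']),
    or 'MessageAttributes' is absent from the response -- a common
    reason 'header' values appear to be missing."""
    attrs = message.get("MessageAttributes", {})
    attr = attrs.get(name)
    return attr.get("StringValue") if attr is not None else None

# Illustrative received-message shape (trimmed), with a placeholder attribute:
msg = {"Body": "hello",
       "MessageAttributes": {"trace-id": {"DataType": "String",
                                          "StringValue": "abc-123"}}}
```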
