Amazon Web Services (AWS) is a collection of remote computing services, also called web services, that make up a cloud-computing platform operated from 11 geographical regions across the world. The most central and well-known of these services are Amazon Elastic Compute Cloud ("EC2") and Amazon Simple Storage Service ("S3"). Other services include Elastic MapReduce (EMR), Route 53 (a highly available and scalable Domain Name System (DNS) web service), Virtual Private Cloud (VPC), and storage, database, deployment and application services.


Hi Experts,

   My web application is running on AWS Ubuntu 16.  I have the following ports open from AWS.
It has the following applications running on Ubuntu:

uWSGI with nginx - I think it uses port 80
Node.js with React - React is rendering on port 9009
webpack with webpack.config.js, which creates a JS bundle that will be using port 80
npm run django (runserver) uses port 8000
Elasticsearch uses port 9200
PostgreSQL uses port 5432

Please see the opened ports on AWS in the screenshot for reference.

With PuTTY I connect to the Ubuntu Linux machine with the following IP address.

There is a Docker application which runs inside that Linux box and uses nginx, Postgres and Elasticsearch.

When I run the Docker application, I am able to see it from the browser.

Whereas when I run the Python runserver, I am not able to see the application.

The browser shows a "This site can't be reached - refused to connect" error.

I can see the Docker application whenever Docker is run.

Please help me understand why the Python application is not reachable in the browser, whereas the Docker application runs well.

With python runserver 0.0.0.0:8000 I am running from the source code, whereas with Docker I am running the Docker image.

Both are the same application.

Please help me in fixing this issue.

With many thanks,
Bharath AK
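
A "refused to connect" error usually means nothing is reachable on port 8000 from outside: either the dev server is bound only to 127.0.0.1, or port 8000 is not open in the instance's security group. A minimal diagnostic sketch (Python 3, run on the EC2 instance itself; the private-IP value is a hypothetical placeholder):

import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # If 127.0.0.1 answers but the private IP does not, runserver is probably
    # listening on 127.0.0.1 only; start it with 0.0.0.0:8000 as in the question.
    PRIVATE_IP = "172.31.0.10"  # placeholder: replace with the instance's private IP
    for host in ("127.0.0.1", PRIVATE_IP):
        print(host, port_open(host, 8000))

If both answer locally but the browser still cannot connect, check the security group rule for port 8000 from your client IP.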

Hi Experts,
Could you help with this tricky syntax please? I'm trying to execute an AWS CLI command within Python 3.

A simple AWS CLI call works fine, but could you help me with a nested AWS CLI command?

This one works fine.
from subprocess import Popen
p = Popen(['aws', 'ec2', 'describe-instances', '--instance-ids', instance, '--output', 'table'])


Please help with this. I'm trying to execute the command below using the above subprocess syntax. It describes an AWS instance based on a user-friendly tag name.
aws ec2 describe-instances --instance-ids $(aws ec2 describe-instances --filters  "Name=tag:Name,Values=testjumpbox" --query "Reservations[].Instances[].[InstanceId]" --output text | tr '\n' ' ') --output table

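One way to reproduce the $(...) command substitution from the shell one-liner in Python is to run the inner lookup first and feed its output to the outer call. A minimal sketch, assuming the AWS CLI and credentials are already configured on the machine:

import subprocess

# Step 1: resolve the instance IDs for the friendly tag name (inner command).
ids = subprocess.check_output(
    ['aws', 'ec2', 'describe-instances',
     '--filters', 'Name=tag:Name,Values=testjumpbox',
     '--query', 'Reservations[].Instances[].[InstanceId]',
     '--output', 'text'],
    text=True,
).split()

# Step 2: describe those instances as a table (outer command).
subprocess.run(
    ['aws', 'ec2', 'describe-instances', '--instance-ids', *ids, '--output', 'table'],
    check=True,
)

Alternatively, the whole one-liner can be passed to Popen with shell=True, but splitting it into two subprocess calls avoids shell quoting issues.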

Verbose info needed: how can I get more verbose information with CloudFormation templates in AWS?

For example, in the screenshot below, how can I get more information on the circled item? How can I see a verbose log for this?
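
The per-resource detail, including the reason a resource failed, lives in the stack's event log. A minimal boto3 sketch for pulling it, where "my-stack" is a placeholder stack name:

import boto3

cfn = boto3.client('cloudformation')
paginator = cfn.get_paginator('describe_stack_events')

for page in paginator.paginate(StackName='my-stack'):
    for event in page['StackEvents']:
        # ResourceStatusReason carries the verbose explanation for failed items.
        print(event['Timestamp'],
              event['LogicalResourceId'],
              event['ResourceStatus'],
              event.get('ResourceStatusReason', ''))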

Hi Experts,

       I am working on a web application (Wagtail CMS, similar to django CMS) which is on AWS Ubuntu.  Its backend is Postgres, with Elasticsearch and a few React.js and Node.js modules.
The application is deployed in Docker containers.  uWSGI is the upstream server for nginx.

      I get lots of errors with npm build and I am not able to run the application.  Sometimes ports are not accessible: Postgres is using 5432, Elasticsearch is on 9200, React is rendering on 9009.  I get lots of dependency errors.  In what situations do I have to use Docker?  Sometimes I get errors in the nginx configuration.  Could you please tell me what the good practice is for npm build, run and deploy?

    What is the best way to architect the development and test environment?

    Will I be able to access PostgreSQL, Elasticsearch and nginx on Ubuntu Linux on AWS while keeping the CMS source code and contents on a local Windows PC?  Or should I have a copy of PostgreSQL, Elasticsearch, the CMS contents and nginx all on a Windows PC as the development environment?  Elasticsearch has huge data.  And how do I deploy the application in Docker containers in the Linux environment?

    What is the best way to have a development environment?  What is the best test environment?

   Could you please tell me how the Elasticsearch data is stored in Postgres, what the network host setting in Elasticsearch is, and how it is accessed from an outside application?

    Could you please shed some light on how to create the development and test environments?

with …
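
One pattern that keeps a Windows dev box and the AWS Ubuntu host interchangeable is to put the service endpoints in environment variables and read them in the Django/Wagtail settings. A minimal sketch; the variable names and the search backend path are illustrative assumptions, not taken from the project:

import os
import dj_database_url

# postgres://user:password@host:5432/dbname -- the host differs per environment
# (local container on Windows vs. the AWS Ubuntu box).
DATABASES = {
    'default': dj_database_url.parse(os.environ['DATABASE_URL']),
}

# Wagtail's Elasticsearch backend just needs a reachable URL; swap the host the
# same way instead of hard-coding port 9200 on one machine.
WAGTAILSEARCH_BACKENDS = {
    'default': {
        'BACKEND': 'wagtail.search.backends.elasticsearch5',
        'URLS': [os.environ.get('ELASTICSEARCH_URL', 'http://localhost:9200')],
        'INDEX': 'wagtail',
    },
}

The same variables can then be supplied by the local containers in development and by the deployment environment on AWS, so only the configuration changes between environments.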
****While my question was pending I learned that the Dell PS6510E unit is not a server but a storage array; any ideas on how I can use this to my advantage?****

I am a newbie in this department for sure. I have been running a website on an AWS EC2 t2.medium Ubuntu Linux instance for the last 2 years and I am rethinking some options. I have in our inventory a Dell PS6510E that I came across and have not tinkered with yet.

Real questions:
Which would be faster and more reliable?
What insight or things may I be overlooking?
For AWS I know the sky's the limit for growing down the road, but how far can the Dell PS6510E get me?

I am constantly importing large amounts of files to my EC2 instance; this is eating up my CPU credits and the time it takes is crazy.

I know every situation is different, but I really need some guidance on whether I should just stick with AWS or whether this Dell PS6510E has real potential for growth.  Also, the PS6510E is discontinued, but I already have it.  I thank you all for the help.

Here is the link to the Dell:
Hi Experts,

I am running an export utility against an Oracle 11g database which is running on AWS Linux.

This process usually takes 45 minutes according to the product documentation, but it is taking almost 2 hours.

Could you please help me identify the bottleneck and the worst SQL, and give me some direction?

Thanks in advance
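
For the "worst SQL" part, one rough starting point is to look at v$sql ordered by elapsed time while the export is running. A minimal sketch using the cx_Oracle driver (assumed installed); the connection details are placeholders:

import cx_Oracle

conn = cx_Oracle.connect("system", "password", "localhost/ORCL")  # placeholders
cur = conn.cursor()
cur.execute("""
    SELECT *
      FROM (SELECT sql_id,
                   ROUND(elapsed_time / 1e6) AS elapsed_seconds,
                   executions,
                   SUBSTR(sql_text, 1, 80) AS sql_text
              FROM v$sql
             ORDER BY elapsed_time DESC)
     WHERE ROWNUM <= 10
""")
for row in cur:
    print(row)

If the top statements look reasonable, the slowdown is more likely I/O or the dump file destination, which the instance's CloudWatch disk metrics can help confirm.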
Hi Experts,
I would like to achieve the below using a Python function. Could anyone help please?

1. The Python script (a function/def) accepts an argument, e.g. Instance_id1 or Instance_id2 or Instance_id3.
2. The script locates the relevant instance section in the ini file (sample file below).
3. It gets always_on_start_time, always_on_finish_time, and the current system time.
4. It then identifies whether the current time is within the start/finish window.
5. If within the start/finish window, it returns true, else false (see the sketch after the sample ini file).

We will then have to execute a few more tasks based on the returned value.

Thanks in advance.

Sample ini file



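A minimal sketch, assuming an ini layout where each instance id is a section with always_on_start_time and always_on_finish_time in HH:MM format (that layout is an assumption, since the sample file is not shown above):

# Assumed ini layout:
#   [Instance_id1]
#   always_on_start_time = 08:00
#   always_on_finish_time = 18:00
import configparser
from datetime import datetime

def is_within_always_on_window(instance_id: str, ini_path: str = 'instances.ini') -> bool:
    """Return True if the current system time is inside the instance's always-on window."""
    config = configparser.ConfigParser()
    config.read(ini_path)

    section = config[instance_id]
    start = datetime.strptime(section['always_on_start_time'], '%H:%M').time()
    finish = datetime.strptime(section['always_on_finish_time'], '%H:%M').time()
    now = datetime.now().time()

    if start <= finish:
        return start <= now <= finish
    # Window that crosses midnight, e.g. 22:00 to 06:00.
    return now >= start or now <= finish

if __name__ == '__main__':
    print(is_within_always_on_window('Instance_id1'))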


We currently have intermittent DNS issues. We have a conditional forwarder to route traffic to AWS; however, it will randomly time out, and after a few minutes it will come back.

We currently have 2 Windows Server 2012 R2 servers plus a DNS server in AWS.

We have a conditional forwarder on our DCs that forwards specific requests to the DNS server in AWS.
Hi Experts,

How do I connect to the Postgres and Elasticsearch servers on Ubuntu Linux on AWS from django CMS or Wagtail CMS on a Windows dev machine?

Please tell me how the Elasticsearch details are stored in the database and how to connect in order to run an Elasticsearch query when it is in the Postgres database.

All the Elasticsearch indexes are stored in PostgreSQL.  How will I connect to Postgres from the Windows dev machine?

And how can I access the Postgres database on Ubuntu on AWS from a development machine that has django CMS or Wagtail CMS on Windows?


The database connection string is created with dj_database_url; the connection string is DATABASE=postgres://user:password@server/database.

Please help me.

I get the following error when I build the Wagtail CMS (it is Python based). Running
python makemigrations

throws the below error.

django.db.utils.OperationalError: could not connect to server: Connection timed out (0x0000274C/10060)
        Is the server running on host "" and accepting
        TCP/IP connections on port 5432?

A sample Postgres connection string is postgres://test1:test2@

When I go inside the AWS machine through PuTTY (SSH, 54.*.*.*:22) and run


telnet 5432


the connection is successful.

But I am not able to connect from outside the AWS machine, from a Windows PC (development machine).

With Many Thanks,
Bharath AK
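
The timeout usually means port 5432 is not reachable from outside AWS: the security group does not allow it, and Postgres by default listens only on localhost. One way to develop from Windows without exposing the database is to tunnel 5432 over the SSH access that already works. A minimal sketch using the sshtunnel and psycopg2 packages (both assumed installed); the IP, key path, user and database names are placeholders:

import psycopg2
from sshtunnel import SSHTunnelForwarder

with SSHTunnelForwarder(
    ('54.0.0.1', 22),                      # placeholder: the instance's public IP
    ssh_username='ubuntu',
    ssh_pkey=r'C:\keys\aws-dev.pem',       # placeholder key path
    remote_bind_address=('127.0.0.1', 5432),
) as tunnel:
    conn = psycopg2.connect(
        host='127.0.0.1',
        port=tunnel.local_bind_port,
        dbname='wagtail',                  # placeholder database name
        user='test1',
        password='test2',
    )
    with conn, conn.cursor() as cur:
        cur.execute('SELECT version();')
        print(cur.fetchone())

The same local port forwarding can be configured in PuTTY itself (Connection > SSH > Tunnels), after which the DATABASE connection string on the Windows machine can simply point at localhost:5432.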

I have a folder in an S3 bucket (ppq-meta/sql). Using PowerShell I want to copy everything in that S3 bucket "ppq-meta" folder "sql" to a local drive z:\test\. This will be run on an EC2 instance; it already has AWS Tools for PowerShell installed.  The names of the S3 bucket and folder are lowercase; I remember them being case sensitive.

I know I can do this with the AWS CLI but don't want to install it on the instance:
aws s3 cp s3://ppq-meta/sql z:\test --recursive


What I have tried in PowerShell is below, but it keeps giving me errors.
Copy-S3Object -AccessKey Enter-Access-Key-Here -SecretKey Enter-Secret-Key-Here -BucketName ppq-meta -KeyPrefix sql -key "*.*" -LocalFolder z:\test



Copy-S3Object : A parameter cannot be found that matches parameter name 'KeyPrefix'.
At line:1 char:120
+ ... tName ppq-meta -KeyPrefix sql -key "*.*" -LocalFolder z:\test
+                    ~~~~~~~~~~
    + CategoryInfo          : InvalidArgument: (:) [Copy-S3Object], ParameterBindingException
    + FullyQualifiedErrorId : NamedParameterNotFound,Amazon.PowerShell.Cmdlets.S3.CopyS3ObjectCmdlet


What am I doing wrong?
How can I suppress this message when I SSH into a Linux server? I can't seem to find the option for this, something like a "no check" flag.

ECDSA key fingerprint is MD5:de:66:6wea:32:dw2:65:d7wwwwwww:1c:a4:05:e0.
Are you sure you want to continue connecting (yes/no)? yes
How can I place a key in an AWS RHEL instance with "user data"?

Is it possible to create a "here document" script and have it execute via the AWS instance "user data"?

Like this - user data with a bash script in it:

#!/usr/bin/env bash
mkdir -p /home/ec2-user/.ssh
cat <<EOF >  /home/ec2-user/.ssh/docker-master-key.pem

I want to be able to pass a key to an instance upon boot. I have used just the above and it does not seem to do anything at all, and I cannot find any errors. How can I find the logging from this user data?

AWS SSM "run " commnand" to pull down a file from s3 bucket and place it into a specific directory

I am attempting to run a shell script that copies  a file ( a pem.key)  from  an S3 bucket , to an AWSes 2 instance

I have the knowlege to get to the run command and run shell scripts no problem , but I have NO IDEA how to copy a file from S3 and place it into the

~/.ssh/ directory ,

the envental goal it to have an applicaion (ansible ) connect to each instance

I think I will first have to copy the PEM key and then that will allow me to copy the key to each instance

how can this be done , I am very new to this part of it

thanks !!
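
A minimal boto3 sketch of the Run Command call itself; the bucket name, key name and instance id are placeholders, and the instance's IAM role must allow s3:GetObject on that bucket for the copy step to succeed:

import boto3

ssm = boto3.client('ssm')

response = ssm.send_command(
    InstanceIds=['i-0123456789abcdef0'],        # placeholder instance id
    DocumentName='AWS-RunShellScript',
    Parameters={
        'commands': [
            'mkdir -p /home/ec2-user/.ssh',
            'aws s3 cp s3://my-bucket/docker-master-key.pem /home/ec2-user/.ssh/',
            'chmod 600 /home/ec2-user/.ssh/docker-master-key.pem',
            'chown ec2-user:ec2-user /home/ec2-user/.ssh/docker-master-key.pem',
        ],
    },
)
print(response['Command']['CommandId'])

The same four shell lines can also be pasted straight into the Run Command console with the AWS-RunShellScript document.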
How can I SSH to a Linux box and copy the key to the destination in one command?

Someone showed me this once; it was something like -l, or ssh -L, or something like that.
I have a company that has about 50 users; they all log in locally with a local account to Windows 10 workstations.

There is no infrastructure except the workstations in this company.

They are interested in a domain controller and want to have it in AWS or Azure with nothing else on site, as they would be using a full VM with a DC in Azure or AWS.

Is this a good idea? Are there latency issues? And if the link to AWS or Azure goes down, will they be able to log in?
Hi Experts,

         I am using Docker containers in my development environment.  It is Ubuntu on AWS.  I have built Docker images and run Docker containers.

When I try "docker ps" it displays all running Docker containers, but I am not able to see the logs, and the application is not harvesting data (Elasticsearch).  I can see only a few results.  There are nearly 8 to 10 Docker containers and I am able to see all the images running, but I am not able to see the logs for all the containers.  Secondly, there are fewer results than expected from the harvester application.

How will I enable the logs for the containers?   Please help me.

Kind Regards,

Bharath AK
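
Each container's stdout/stderr is already captured by Docker; "docker logs <container>" shows it per container. A minimal sketch of the same thing done programmatically with the Docker SDK for Python (the docker package, assumed installed), which makes it easier to scan 8 to 10 containers at once:

import docker

client = docker.from_env()

for container in client.containers.list(all=True):
    print(f"== {container.name} ({container.status}) ==")
    # tail=50 keeps the output manageable; logs() returns bytes.
    print(container.logs(tail=50).decode('utf-8', errors='replace'))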
I woke up this morning and my AWS Linux EC2 instance running WordPress is showing just the Apache test page. I have WordPress installed on this site and it's been up and running for about a week or so.

Here's a few things I've done to resolve this:

  • Restarted Apache
  • Confirmed MySQL is running
  • Restarted the EC2 Instance from the AWS Management Console

Anything I missed? I'd love to save my work and show it to the world.

The URL is

Thank you for your assistance.


Hi Experts,

I have an AWS instance and I would like to manage it using the AWS CLI.

I was able to install, configure and set up the AWS CLI on Linux, and I am able to stop/start a particular instance using the CLI.

My request here is:
Since I have this as part of a CI/CD Jenkins pipeline, when starting an instance I would like to
  1. Ensure the instance is fully up and running.
  2. Ensure the required services are up and running as well (e.g. Jenkins, Docker, etc.).
  3. Only then trigger the downstream job.

Could you please help with the best way to achieve this? I'm not restricted to the AWS CLI; we can also try boto3 if that gives more control (see the sketch below).

thanks in advance
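
A minimal boto3 sketch of that sequence: wait for the EC2 status checks, then poll a service endpoint before handing off to the downstream job. The instance id, port and Jenkins path are placeholders:

import time
import urllib.request

import boto3

INSTANCE_ID = 'i-0123456789abcdef0'    # placeholder

ec2 = boto3.client('ec2')
ec2.start_instances(InstanceIds=[INSTANCE_ID])

# 1. Instance fully up: this waiter blocks until both EC2 status checks pass.
ec2.get_waiter('instance_status_ok').wait(InstanceIds=[INSTANCE_ID])

# 2. Required services up: poll an application endpoint until it answers.
desc = ec2.describe_instances(InstanceIds=[INSTANCE_ID])
ip = desc['Reservations'][0]['Instances'][0]['PublicIpAddress']

for _ in range(30):                    # roughly 5 minutes at 10-second intervals
    try:
        urllib.request.urlopen(f'http://{ip}:8080/login', timeout=5)  # assumed Jenkins port/path
        break
    except OSError:
        time.sleep(10)
else:
    raise SystemExit('services did not come up in time')

# 3. Safe to trigger the downstream job from here (or return success to Jenkins).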
Hi Experts,
Can someone help me take a MySQL incremental backup using mysqldump? All my databases are on AWS RDS, so bin logging is not possible.
Hello AWS experts,

We have EC2 instances running applications in Docker containers, for which the auto-scaling part is managed by ECS. It's all running internally in the VPC, so these EC2 instances and Docker containers have no public internet-routable IP addresses.

What we want to do is use Route 53 health checks to check certain TCP ports and send HTTP requests to the applications from various locations. The problem is of course that these probes from the various locations cannot get to services running internally on the VPC. What's the proper and easiest way to accomplish this?

We want to avoid setting up a monitoring service inside the VPC.
We want it to be set up in a way that when we autoscale we don't have to change anything.
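
One pattern that satisfies both constraints is to let a CloudWatch alarm (driven by a metric that already sees the private instances, for example the target group's HealthyHostCount) back a Route 53 health check of type CLOUDWATCH_METRIC, so Route 53 never needs network access into the VPC and instances added by auto scaling are covered automatically. A minimal boto3 sketch; the alarm name and region are placeholders:

import uuid

import boto3

route53 = boto3.client('route53')

route53.create_health_check(
    CallerReference=str(uuid.uuid4()),        # must be unique per request
    HealthCheckConfig={
        'Type': 'CLOUDWATCH_METRIC',
        'AlarmIdentifier': {
            'Region': 'eu-west-1',            # placeholder: region of the alarm
            'Name': 'app-healthy-host-count', # placeholder: existing CloudWatch alarm
        },
        'InsufficientDataHealthStatus': 'Unhealthy',
    },
)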

I've got stuck on one trivial thing. I've got an Ansible playbook:

- name: Build cluster
  hosts: localhost
  gather_facts: no
  roles:
    - buildcluster

- name: Run software setup
  hosts: tag_Name_{{ cluster_name }}
  gather_facts: no
  roles:
    - iso_pipeline_install

I'm setting the {{ cluster_name }} tag on the master node of the cluster. My {{ cluster_name }} has dashes in it, but the inventory group syntax accepts only underscores. And I don't know the IP address before I run the CloudFormation build. Is there a good practice for targeting the created server (I've got a cluster and I need to install the software only on the master node)? Thank you in advance.
Hello Experts,

I have a Java-based web application running on an EC2 instance. I have pointed our own domain name at it and can access the URL using the domain name.

I want to apply an SSL certificate to the web application. I have tried creating a certificate in ACM and configuring a classic load balancer (this is what Amazon recommends); however, I lost access to my application when everything was done and had to revert to the original settings. I still want to apply an SSL certificate.

Your help would be highly appreciated. Do I need to change something in my web server's configuration? I use Tomcat.

Thanks in advance.

How can I convert XML to CSV in AWS using boto3?
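
boto3 itself only moves the bytes in and out of S3; the conversion happens with ordinary Python. A minimal sketch that downloads an XML object, flattens its records and uploads a CSV; the bucket, keys and the <record> element layout are illustrative assumptions:

import csv
import io
import xml.etree.ElementTree as ET

import boto3

s3 = boto3.client('s3')

def xml_to_csv(bucket: str, xml_key: str, csv_key: str) -> None:
    # Assumed layout: <records><record><field>value</field>...</record></records>
    body = s3.get_object(Bucket=bucket, Key=xml_key)['Body'].read()
    root = ET.fromstring(body)

    rows = [{child.tag: child.text for child in record} for record in root]
    if not rows:
        return

    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=sorted(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)

    s3.put_object(Bucket=bucket, Key=csv_key, Body=out.getvalue().encode('utf-8'))

# Example call with placeholder names:
# xml_to_csv('my-bucket', 'input/data.xml', 'output/data.csv')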

