AWS

Amazon Web Services (AWS) is a collection of remote computing services, also called web services, that make up a cloud-computing platform operated from 11 geographical regions across the world. The most central and well-known of these services are Amazon Elastic Compute Cloud, also known as "EC2", and Amazon Simple Storage Service, also known as "S3". Other services include Elastic MapReduce (EMR), Route 53 (a highly available and scalable Domain Name System (DNS) web service), Virtual Private Cloud (VPC), and storage, database, deployment and application services.

Trying to build a LAMP stack on AWS.

Typed in sudo apt-get install mysql-server and it did its thing, ending at a prompt.

I then typed in sudo service mysql start... again, it did its thing.

I'm assuming that MySQL started (no errors were thrown) and thus was installed. During the install I was never prompted to create a password.

Is there a default password?

Typing in mysqladmin -u root -p status prompts for a password.
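If this is a recent Ubuntu/Debian MySQL package (5.7 or later), there may simply be no root password: the root account authenticates through the auth_socket plugin, which is why the installer never prompted for one. A hedged sketch of the usual way to set one explicitly (the password below is a placeholder):

# Assumption: Ubuntu/Debian MySQL 5.7+ where root uses auth_socket, so there is no default password.
# Connect as root via sudo (no password needed when using auth_socket):
sudo mysql

-- Then, inside the mysql shell, give root a real password (replace the placeholder):
ALTER USER 'root'@'localhost' IDENTIFIED WITH mysql_native_password BY 'ChangeMe_Placeholder';
FLUSH PRIVILEGES;

After that, mysqladmin -u root -p status should accept the new password; running sudo mysql_secure_installation is the interactive alternative.
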

I'm using AWS Elastic Beanstalk with an nginx proxy server in front of my application. I've got it set up on port 5000. I am getting a "took too long to respond" HTTP ERROR 504; however, it occurs instantly.

Locally on the server, I can curl http://localhost:3000/login and it returns the html of the login page.

Here is my nginx.conf:

# Elastic Beanstalk Nginx Configuration File
user                    nginx;
error_log               /var/log/nginx/error.log debug;
pid                     /var/run/nginx.pid;
worker_processes        auto;
worker_rlimit_nofile    200000;

events {
    worker_connections  1024;
}

http {
    include       /etc/nginx/mime.types;
    default_type  application/octet-stream;

    log_format  main  '$remote_addr - $remote_user [$time_local] "$request" '
                      '$status $body_bytes_sent "$http_referer" '
                      '"$http_user_agent" "$http_x_forwarded_for"';

    include       conf.d/*.conf;

    map $http_upgrade $connection_upgrade {
        default     "upgrade";
    }

    server {
        listen        80 default_server;
        access_log    /var/log/nginx/access.log main;

        client_header_timeout 60;
        client_body_timeout   60;
        keepalive_timeout     60;
        gzip                  off;
        gzip_comp_level       4;
        gzip_types text/plain text/css application/json application/javascript application/x-javascript text/xml application/xml application/xml+rss text/javascript;

       

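The paste cuts off before any proxy section, so purely for reference, here is a minimal sketch of the kind of location block the server section would normally carry, assuming the application really listens on 127.0.0.1:3000 (as the curl test suggests) rather than 5000; the names and timeouts are illustrative, not taken from the real file:

        # Hypothetical proxy block for the server section above.
        # Assumption: the app listens on 127.0.0.1:3000, per the successful curl test.
        location / {
            proxy_pass            http://127.0.0.1:3000;
            proxy_http_version    1.1;
            proxy_set_header      Host               $host;
            proxy_set_header      X-Real-IP          $remote_addr;
            proxy_set_header      X-Forwarded-For    $proxy_add_x_forwarded_for;
            proxy_set_header      Upgrade            $http_upgrade;
            proxy_set_header      Connection         $connection_upgrade;
            proxy_connect_timeout 60;
            proxy_read_timeout    60;
        }

An instant 504 from Elastic Beanstalk usually means the proxy (or the load balancer health check) is pointed at a port nothing is listening on, so it is worth confirming whether the upstream is configured for 5000 or 3000.
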
Bear with me, I am a noob on AWS. I want to properly secure an RDS instance by following best practices and putting it on a subnet group that does not have a gateway to the internet. The methods described for managing the instance involve using a web server in a subnet that does have a gateway to create an SSH tunnel to it. All well and good, until the design is to use S3 as the web server and an API to call a Lambda function that actually does something. Without an EC2 instance running, I am unable to find a way to manage and restore data to the RDS instance once it is placed in the secure subnet.
As a workaround, I have created an ACL that whitelists my office IP for port 1433, and I still have the RDS instance attached to the default VPC.
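One hedged option for managing the instance without a bastion: attach a small Lambda function to the same private subnets and security group as the RDS instance and run administrative SQL through it. A minimal sketch, assuming SQL Server (port 1433 suggests so), the pymssql driver bundled into the deployment package, and credentials supplied via environment variables; none of these details come from the original setup:

# Hypothetical Lambda handler that runs a query against a private RDS SQL Server instance.
# Assumes: VPC configuration pointing at the RDS subnets/security group, pymssql bundled
# in the deployment package or a layer, and DB_* environment variables set on the function.
import os
import pymssql

def lambda_handler(event, context):
    conn = pymssql.connect(
        server=os.environ['DB_HOST'],          # private RDS endpoint
        user=os.environ['DB_USER'],
        password=os.environ['DB_PASSWORD'],
        database=os.environ.get('DB_NAME', 'master'),
    )
    cur = conn.cursor()
    cur.execute(event.get('sql', 'SELECT @@VERSION'))
    rows = cur.fetchall()
    conn.close()
    return {'rows': [list(r) for r in rows]}
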
We have 2TB of archived files that we want to move to Cloud storage. The current 2 TB of files will increase as we archive more data, so for now we would need 2.5 to 3 TB of space.

We will need users to be able to access the files (read only) as needed and be able to download from the archive as needed.  

Azure file storage is too expensive at this time.

We do not want to use AWS because we are using Azure for cloud backup and will eventually use Azure for file storage.

Other than Dropbox, can anyone recommend another cloud storage solution for storing these files?

Thanks,
cja
AWS and Speech Recognition

I have heard AWS has an excellent speech recognition web API, and wonder if anyone knows of a complete speech recognition app which uses this AWS API?

Apparently, the demo emails you the transcription, and I hear the transcription is great.

Anybody got any leads on an app that can handle speech to text?

Thanks.
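For reference, the AWS speech-to-text service is Amazon Transcribe, and it is straightforward to drive from boto3. A minimal sketch, assuming an audio file already uploaded to S3; the bucket, key, and job name are placeholders:

# Hypothetical example: transcribe an MP3 that is already in S3 using Amazon Transcribe.
import time
import boto3

transcribe = boto3.client('transcribe', region_name='us-east-1')

transcribe.start_transcription_job(
    TranscriptionJobName='demo-job-1',                        # placeholder job name
    Media={'MediaFileUri': 's3://my-bucket/audio/demo.mp3'},  # placeholder S3 URI
    MediaFormat='mp3',
    LanguageCode='en-US',
)

# Poll until the job completes, then print the URL of the JSON transcript.
while True:
    job = transcribe.get_transcription_job(TranscriptionJobName='demo-job-1')
    status = job['TranscriptionJob']['TranscriptionJobStatus']
    if status in ('COMPLETED', 'FAILED'):
        break
    time.sleep(10)

if status == 'COMPLETED':
    print(job['TranscriptionJob']['Transcript']['TranscriptFileUri'])
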
hi guys

I have an EC2 AWS instance running Apache. We previously bought SSL certificates and had them installed. They have now expired, and we renewed them with GoDaddy.

I want to install them on the server, but I can't seem to find the location where they need to go. One of our techies who has since left may have played with the httpd.conf file, but I am unable to work this out.

Can someone give advice on how to work on this?

Thank you
Yash
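A hedged way to locate where the old certificates live, since Apache's config names the paths explicitly (directory layout varies by distro: /etc/httpd on Amazon Linux/CentOS, /etc/apache2 on Ubuntu):

# Find the certificate/key paths the running config points at:
sudo grep -Ri "SSLCertificate" /etc/httpd /etc/apache2 2>/dev/null

# Show which config files and virtual hosts Apache is actually loading:
sudo apachectl -S

# After overwriting the .crt/.key/bundle files at the paths found above with the renewed GoDaddy files:
sudo apachectl configtest && sudo service httpd restart

The relevant directives in httpd.conf (or an included ssl.conf / vhost file) are SSLCertificateFile, SSLCertificateKeyFile, and SSLCertificateChainFile.
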
I've got a Python script that traverses S3 Buckets and prints out what folders and files have public permissions. This can be handy when auditing AWS for potential security issues.

Right now, the script runs fine, but takes a long time to run, due to a CDN that has known public permissions. I would like to exclude that bucket when I run the script.

Can someone please help me create a line in the script that allows me to EXCLUDE a particular bucket? Let's call the bucket I want to exclude "cdn-twt" for the sake of this script.

Thanks in advance for your assistance.

#This Script will use Paginator to print result for each bucket, executed in multiple threads
import boto3
import threading
import os.path

ACCESS_KEY = 'A*****************A'
SECRET_ACCESS_KEY = 'P******************************2'

session = boto3.Session(aws_access_key_id = ACCESS_KEY, aws_secret_access_key = SECRET_ACCESS_KEY)

maxthreads = 5
sema = threading.Semaphore(value=maxthreads)

def list_object(bucket):
    try:
        s3 = session.client('s3')
        flag1 = objcount = 0
        paginator = s3.get_paginator('list_objects')
        page_iterator = paginator.paginate(Bucket= bucket)
        for page in page_iterator:
            if 'Contents' in page:
                for obj in page['Contents']:
                    uniobj = obj['Key'].encode('ascii', 'ignore').decode('ascii')
                    objAcl = s3.get_object_acl(Bucket=bucket, Key=obj['Key'])
                    flag2 = 0
   

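The paste cuts off before the part that enumerates buckets and spawns the threads, so this is only a hedged sketch of where an exclusion could go, assuming a wrapper that lists all buckets and calls list_object() for each; scan_all_buckets is a hypothetical helper and 'cdn-twt' is the bucket named in the question:

# Hypothetical exclusion: list the buckets to skip, then test the name before scanning.
EXCLUDED_BUCKETS = {'cdn-twt'}

def scan_all_buckets():
    s3 = session.client('s3')
    for bucket in s3.list_buckets()['Buckets']:
        name = bucket['Name']
        if name in EXCLUDED_BUCKETS:
            print('Skipping excluded bucket: %s' % name)
            continue
        t = threading.Thread(target=list_object, args=(name,))
        t.start()

If the real script already has a loop like this, the two lines that matter are the EXCLUDED_BUCKETS set and the "if name in EXCLUDED_BUCKETS: continue" test.
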
Does anyone out there know or understand how Glacier generates the requests when retrieving a file from Glacier?
I have only been able to find the estimates of how much per GB and how much per 1,000 requests.
I'd like to know how many requests retrieving a file may invoke; I can't really find anything on this at AWS.
Hi Experts - my inquiry boils down to three questions:  

1.). What might a Security Analyst have found in AWS instances (e.g., any hosted database or server solutions) to indicate TOR usage?

2.). How do you effectively detect and contain/eradicate TOR usage?

3.). What are the tell-tale behaviors that indicate TOR is being used on your network?

Thanks a million in advance for any insight provided!
Hi,


We have emails sent from an Amazon server (from our domain name).

My question is: what are Amazon SES Domain Verification TXT Records?

Is it an SPF record, or is it not the same thing but just another verification process?

Thanks
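For illustration only (the token below is a placeholder, not a real value): the SES domain-verification TXT record lives at an _amazonses host name and carries a verification token, which is separate from an SPF TXT record on the domain itself.

; Hypothetical zone-file entries for example.com
_amazonses.example.com.   TXT   "EXAMPLE-SES-VERIFICATION-TOKEN"       ; SES domain verification
example.com.              TXT   "v=spf1 include:amazonses.com ~all"    ; SPF record (a separate, optional record)

So it is not an SPF record; it is SES's own verification mechanism, though an SPF (and DKIM) record is still worth adding for deliverability.
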

I have the following DB connection string inside my web app. The localhost version works but the live website does not. Do you know how to fix it?
Both the DB and IIS are on the same AWS instance.

 <add name="ConnectionString" connectionString="Data Source=WIN-GB8M3MM6asdfasdT2M\SQLEXPRESS;Initial Catalog=Pwr;Integrated Security=True;"
      providerName="System.Data.SqlClient" />
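One hedged thing to check: with Integrated Security=True, IIS connects as the application pool identity, which typically has no SQL Server login, whereas local debugging runs under your own Windows account - which would explain localhost working while the live site fails. A sketch of the SQL-authentication alternative (the user name and password are placeholders you would first create in SQL Server; the Data Source can stay as whatever already works locally):

 <add name="ConnectionString" connectionString="Data Source=.\SQLEXPRESS;Initial Catalog=Pwr;User ID=pwr_app;Password=PLACEHOLDER_PASSWORD;"
      providerName="System.Data.SqlClient" />
 <!-- Alternatively, keep Integrated Security=True and grant the app pool identity
      (e.g. IIS APPPOOL\YourAppPoolName) a login and access to the Pwr database. -->
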
We have one server using an SSL certificate that we purchased on GoDaddy, and our domain is www.xyz.com.
Now we have another server hosted on AWS, and it is still www.xyz.com. We just use it for failover purposes:
for example, if our main office's power is off, we will switch www.xyz.com's IP to AWS in DNS Made Easy.

My question is: I want to install SSL on the AWS IIS server. Where should I get the certificate? From GoDaddy.com, or can I export it from the office IIS?

Thanks
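Since it is the same domain name, one hedged approach is to export the existing certificate (with its private key) from the office IIS server as a PFX and import it on the AWS instance, rather than buying a second certificate. A PowerShell sketch; the thumbprint, path, and password are placeholders:

# On the office IIS server: export the certificate and private key to a PFX file.
$pwd = ConvertTo-SecureString -String "PLACEHOLDER" -AsPlainText -Force
Get-ChildItem Cert:\LocalMachine\My\THUMBPRINT_HERE |
    Export-PfxCertificate -FilePath C:\temp\xyz-com.pfx -Password $pwd

# On the AWS IIS server: import it, then bind it to the site in IIS Manager.
Import-PfxCertificate -FilePath C:\temp\xyz-com.pfx -CertStoreLocation Cert:\LocalMachine\My -Password $pwd

Whether running the same certificate on a second (failover) server fits GoDaddy's terms is worth confirming with them.
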
Hi Experts,

I have downloaded and installed Amazon WorkDocs Drive on my Windows 10 PC.

To get WorkDocs Drive running, I am being asked to log in with my d-12345eb0e credentials, consisting of a Username and Password.
Where do I get or set the Username and Password?
I've tried using my AWS Console credentials, but that does not work.

Regards,
Leigh
Hi
I think I mucked up a decision a few weeks back.
A sparkly girl from GoDaddy called me one morning during a remorseless coding buzz of mine.
She said they had reviewed my godaddy account and were uncertain about my server decisions.
Apparently, they had a new package deal for a "non-VPS" server of my own to handle my games' (plural) needs.
I didn't sense any red flags, so listened on.
I told her I needed a server to handle my games'  traffic and deliver appropriate responses and content to the user.
She didn't object, and told me it would be my own dedicated machine - and could do what I needed.
I was admittedly skeptical, because if this was a special deal for a personal server, they'd need quite a big farm for all the potential global individual clients?
I should have asked her where the location was and how many clients had servers.
Anyway, she said it was a short-time offer, and I thought it was a good deal for my needs, so I bought the damn thing for $1,000.
Can I salvage this blunder? Was it a blunder?
Can I work with the thing they sold me to make phone games?
I have since discovered Amazon Web Services, which would be far better suited to my needs - you only pay for the traffic you use.

Does GoDaddy have a similar system to AWS? Is it in development?
Can I demand a refund for the blunder and use that money for GoDaddy web services? GoDaddy refunds have to stay within the account.

Thanks
I have a legacy Windows app I've installed on a Windows Server 2019 EC2 instance.  I want to let a client use it over RDP (well, through a web browser and RDS).

When I set up RDS it tells me I need a CAL per user. The AWS docs seem to say that EC2 server instances are pre-licensed for 2 admin users over RDP, plus unlimited clients (presumably of the web server or similar).

Can I let the client use just my Windows application on the machine without further licences? If so, how do I tell RDS that it's legitimate use, or if not, what's the least-cost solution to setting this up legitimately?

Simon.
Since changing system options on the server can cause serious issues if not done properly, I prefer to check with you before making changes.

Below is the AWS EC2 linux httpd.conf file as it is now.  

1. Is .htaccess content located within the httpd.conf file itself, or are the controls that govern the hidden .htaccess file contained within httpd.conf?

2. Which lines would I have to change in order to add ..
<Files *.php>
	Require all denied
	Require local
</Files>


.. and where would I add them?

3. I understand that I have to restart the instance. Since there is no 'restart' option in the management console, does this mean 'Start' the EC2 instance state, or 'Reboot' the running instance?

Thank you

#
# Deny access to the entirety of your server's filesystem. You must
# explicitly permit access to web content directories in other
# <Directory> blocks below.
#
<Directory />
    AllowOverride none
    Require all denied
</Directory>

#
# Note that from this point forward you must specifically allow
# particular features to be enabled - so if something's not working as
# you might expect, make sure that you have specifically enabled it
# below.
#

#
# DocumentRoot: The directory out of which you will serve your
# documents. By default, all requests are taken from this directory, but
# symbolic links and aliases may be used to point to other locations.
#
DocumentRoot "/var/www/html"

#
# Relax access to content within /var/www.
#
<Directory "/var/www">
    

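A hedged sketch of how the addition could look, assuming the PHP files in question live under the default DocumentRoot shown above; this is illustrative, not a quote from the real file. (.htaccess itself is a separate per-directory file; httpd.conf only controls whether it is honoured, via AllowOverride.)

# Hypothetical placement: inside (or as an additional) <Directory "/var/www/html"> block.
<Directory "/var/www/html">
    <Files "*.php">
        Require all denied
        Require local
    </Files>
</Directory>

# To apply the change, validate and restart Apache from an SSH session.
# Restarting the httpd service is enough; the EC2 instance itself does not need a reboot:
#   sudo apachectl configtest
#   sudo service httpd restart
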
Hi,



I have a Synology DS918+ and back it up to a disk and to another Synology DiskStation (Hyper Backup). However, I noticed the backup to the other DiskStation hasn't happened for quite some time.
How can I make sure a backup is also done online, in the easiest way and for the least price, or even free (I'm backing up a maximum of about 4 TB, of which little changes)?
Do I use Azure, Glacier, OneDrive, other?

Note: is there a way to detect ransomware (cryptolocker) in time?



J
If I hit the site with the direct https://site.com it works fine, but I'm trying to have HTTP forward to HTTPS so that the site can only be viewed securely. My issue is that the redirect I used in the screenshot had no effect on forcing HTTPS. It looks so easy, yet I'm not sure what else to check other than making sure HTTPS is configured correctly and the redirect on the port 80 listener is configured.


    Add/Edit your HTTP:80
    listener

    Set the action to Redirect

    protocol: https

    port: 443

    set the next dropdown to Original host, path, query

    set the last dropdown to 301 - Permanently moved


ssl_awsredirect.png

There are so many articles on how to do this, but it seems like this is the currently preferred method of handling the redirect.
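For comparison, the same redirect can be expressed through the AWS CLI; a hedged sketch that mirrors the console steps listed above (the listener ARN is a placeholder):

# Hypothetical CLI equivalent of the console redirect configuration.
aws elbv2 modify-listener \
    --listener-arn arn:aws:elasticloadbalancing:REGION:ACCOUNT:listener/app/MY-ALB/XXXX/YYYY \
    --default-actions '[{
        "Type": "redirect",
        "RedirectConfig": {
            "Protocol": "HTTPS",
            "Port": "443",
            "Host": "#{host}",
            "Path": "/#{path}",
            "Query": "#{query}",
            "StatusCode": "HTTP_301"
        }
    }]'

If the 301 still never appears, it is worth confirming that the load balancer's security group allows inbound port 80 and that the browser is not replaying a cached redirect from earlier testing.
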
Hello All,

A project has come up to scan uploaded documents for viruses on demand in our site.
 
Some info on our site:
C#, MVC system with razor templates and xml configurations for custom UI.
AWS hosted web server and db.
Currently the users can attach documents to various items in our system.  Those documents are stored as varbinary in the db tables.


It was suggested to me to look into Clamav.net, but I'm wondering what the best solution (tool), other than ClamAV, would be for scanning a file that goes directly into a db table rather than a folder structure on the server. If it is necessary to save the file in a folder structure in order to do the scanning, then we can change the system to do that.

Thank you for your time and suggestions.

Sincerely,
Ex
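If ClamAV does end up being the choice, the varbinary storage is not necessarily an obstacle: the clamd daemon can scan a byte stream directly, so the file never has to be written to a folder. A hedged C# sketch using the nClam NuGet package; the package's exact API, and the host/port of a locally running clamd, are assumptions rather than anything from the original system:

// Hypothetical sketch: scan bytes read from the varbinary column, without touching the filesystem.
// Assumes the nClam NuGet package and a clamd daemon listening on localhost:3310.
using System.Threading.Tasks;
using nClam;

public static class VirusScanner
{
    public static async Task<bool> IsCleanAsync(byte[] documentBytes)
    {
        var clam = new ClamClient("localhost", 3310);
        var result = await clam.SendAndScanFileAsync(documentBytes);
        return result.Result == ClamScanResults.Clean;
    }
}
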

Hi:
I use AWS Linux and PHP 7.0.27.
I want to make access to my scripts more secure, to prevent a hacker from appending to the URL to access a script directly. I read that if scripts are put into a private directory that is inaccessible from the browser, no access using the browser would be possible; access would only be possible from PHP.

The directory structure ..

/var/www/html/index.php (public)      root:root

/var/www/private/script1.php (private)            root:apache

If I code an include script in index.php and place the include script in private, the include script is accessed as expected.

However, when I want to access a non-include script (script1.php) from index.php with ..

$url = "./private/script1.php";
ob_end_clean();
header("Location: $url");
exit();

the browser throws the error ..
'The requested URL /private/script1.php was not found on this server.'

I guess this is because index.php is in the public directory for access to/from the browser, and script1.php is in the disallowed private directory.

In this scenario, how can I code PHP to make index.php redirect to script1.php?
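A hedged sketch of the usual pattern: header('Location: ...') asks the browser to request the private path, which the web server rightly refuses, whereas pulling the script in server-side with require keeps it reachable only from PHP. The path below assumes the directory layout shown in the question:

<?php
// index.php (public): run the private script server-side instead of redirecting the browser to it.
// Assumes /var/www/html is public and /var/www/private is outside the DocumentRoot.
$script = '/var/www/private/script1.php';

if (is_readable($script)) {
    require $script;   // executed by PHP; the private path is never exposed as a URL
} else {
    http_response_code(500);
    echo 'Private script is not available.';
}

Note that the apache user needs read access to /var/www/private; the root:apache ownership mentioned in the question suggests group-read permissions are the intent.
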
I've got a client who's running Bitnami Wordpress using AWS Lightsail.

When I go to the page, it's completely white. I thought this was pretty unusual behavior, so I did some Google-Fu and found a known permissions issue that has to do with group and ownership.

This requires adjusting the ownership of the wp-config.php file, a normal file that should be located at the root of Wordpress.

It's not.

Which is frustrating.

When I locate it using this command: locate wp-config.php

I find it in a swap file form: e.g. 2COMPANYNAME.com/.wp-config.php.swp

at

/opt/bitnami/apache2/var/cache/mod_pagespeed/v2/COMPANYNAME.com/http

How can I take that ".wp-config.php.swp" and move it to "~/apps/wordpress/htdocs", which is where index.php is for some reason and where I expected to find the "wp-config.php" file.

Thanks for your help.

Thom


...
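A file named .wp-config.php.swp is usually a vim swap file or a cached copy, not the real configuration, so a hedged next step is to search for the actual wp-config.php before moving anything:

# Refresh the locate database, then look for the real file (ignoring swap/cache copies):
sudo updatedb
locate wp-config.php | grep -v '\.swp'

# Or search the usual Bitnami locations directly:
sudo find /opt/bitnami /home -name 'wp-config.php' 2>/dev/null

If it genuinely does not exist, WordPress can regenerate it from wp-config-sample.php in the htdocs directory, with the database settings filled back in.
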
I have built an nginx development system on an AWS Linux instance. I FTP'd all of the files from a working site that is Apache based. Two strange things have happened that prevent me from updating or deleting the plugins.

1. I zipped my database on the source system, FTP'd it to the target system, and unzipped it there. When I go into the dashboard I am unable to update the plugins, nor can I delete them.
2. Also, the table prefix went from wp_ to wpstg0; could this be affecting the plugins?

I need to get this site up and running so thanks for your help.

Randal
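On the prefix question: WordPress only sees tables whose names match $table_prefix in wp-config.php, and admin capabilities live in the prefixed options and usermeta tables, so a prefix mismatch is a plausible cause of not being able to update or delete plugins. A hedged sketch, assuming the imported tables really are named wpstg0options, wpstg0posts, and so on:

<?php
// In wp-config.php: the prefix must match the imported table names exactly.
// If the tables are wpstg0options, wpstg0posts, ... then:
$table_prefix = 'wpstg0';
// If they are wp_options, wp_posts, ... it stays:
// $table_prefix = 'wp_';

If the prefix is made to match and plugin updates still fail, file ownership and permissions from the FTP copy are the other usual suspect on an nginx box.
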
I need to install MariaDB on an Amazon Linux 2 AWS LEMP stack, except I keep getting the following dependency error:

[root@ip-172-31-34-149 etc]# yum install -y MariaDB-server MariaDB-client
Loaded plugins: priorities, update-motd, upgrade-helper
Repository mariadb is listed more than once in the configuration
2 packages excluded due to repository priority protections
Resolving Dependencies
--> Running transaction check
---> Package MariaDB-client.x86_64 0:10.3.17-1.el7.centos will be installed
--> Processing Dependency: MariaDB-common for package: MariaDB-client-10.3.17-1.el7.centos.x86_64
--> Processing Dependency: libsystemd.so.0()(64bit) for package: MariaDB-client-10.3.17-1.el7.centos.x86_64
---> Package MariaDB-server.x86_64 0:10.3.17-1.el7.centos will be installed
--> Processing Dependency: libsepol >= 2.5-6.el7 for package: MariaDB-server-10.3.17-1.el7.centos.x86_64
--> Processing Dependency: perl(DBI) for package: MariaDB-server-10.3.17-1.el7.centos.x86_64
--> Processing Dependency: galera for package: MariaDB-server-10.3.17-1.el7.centos.x86_64
--> Processing Dependency: libsystemd.so.0(LIBSYSTEMD_209)(64bit) for package: MariaDB-server-10.3.17-1.el7.centos.x86_64
--> Processing Dependency: perl(Data::Dumper) for package: MariaDB-server-10.3.17-1.el7.centos.x86_64
--> Processing Dependency: libsystemd.so.0()(64bit) for package: MariaDB-server-10.3.17-1.el7.centos.x86_64
--> Running transaction check
---> Package MariaDB-client.x86_64 0:10.3.17-1.el7.centos will 

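A hedged alternative, assuming the box really is Amazon Linux 2: the MariaDB.org el7/centos repository often trips over galera and libsystemd dependencies there, whereas the distribution's own extras channel installs cleanly. Version numbers below are illustrative and may have moved on:

# Hypothetical fix on Amazon Linux 2: use the built-in extras repository instead of the MariaDB.org repo.
sudo amazon-linux-extras enable mariadb10.5
sudo yum clean metadata
sudo yum install -y mariadb-server
sudo systemctl enable --now mariadb

The trade-off is that the extras channel pins you to the MariaDB version Amazon packages, rather than the specific 10.3.x build from mariadb.org that the error above refers to.
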
Need help with AWS project - a website to sell mobile insurance. Need help with AWS Lambda Function development.
Hi, I have a client that has an on-prem Exchange server and is considering options to move to Azure or AWS. I've read conflicting reports online where people claim MS doesn't support Exchange on cloud VMs. Does anyone know whether this is the case or not?

The client has a third-party app that possibly hinders moving to O365 (asking about this in a separate question.)
