Docker is a computer program used to run software packages called containers in an operating-system-level virtualization process called containerization. It’s developed by Docker, Inc. and was first released in 2013.

Hi, I am looking for some integration from Jenkins to the SonarQube scanner, and similarly for other tools like Ansible, Artifactory, Fortify on Demand, etc. Has anyone worked with these tool integrations without actually installing plugins on Jenkins, and instead integrated the tools via the Sonar scanner CLI, for example, or used a REST API to talk to these tools from Jenkins? If so, could you please give a shout and, if possible, share any code references/links to demonstrate this, and I will try to get back to you.

Out of all the tool integrations mentioned in the question, the priority is the Jenkins - SonarQube integration without using the SonarQube plugin on Jenkins.

I will be looking to set this up on Linux within a Docker image. Installing directly on CentOS is also an option.

Many Thx
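For the SonarQube part, one plugin-free approach is to call the standalone sonar-scanner CLI from a plain shell step in the Jenkins job. A minimal sketch, assuming the scanner is on the agent's PATH and that the URL/token/project key (all placeholders here) are injected via Jenkins environment variables and credentials:

```shell
# Run the standalone SonarQube scanner from a Jenkins "Execute shell" step.
# SONAR_HOST_URL, SONAR_AUTH_TOKEN and the project key are placeholders --
# inject the token via Jenkins credentials rather than hard-coding it.
sonar-scanner \
  -Dsonar.host.url="${SONAR_HOST_URL}" \
  -Dsonar.login="${SONAR_AUTH_TOKEN}" \
  -Dsonar.projectKey=my-project \
  -Dsonar.sources=.
```

Because the scanner talks to the server over HTTP itself, no Jenkins plugin is involved; if the pipeline needs the analysis result afterwards, the SonarQube Web API (e.g. `/api/qualitygates/project_status`) can be polled with curl in the same shell step.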
I am trying to launch chromium-browser on startup of an ubuntu:bionic Docker container, but I am encountering the error below.

(chromium-browser:1570): Gtk-WARNING **: 15:36:04.321: cannot open display:

Can someone help me with this?
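"cannot open display" means no X server is reachable from inside the container. A common workaround, assuming an X server is running on the host (the image name and flags below are illustrative):

```shell
# On the host: allow local connections to the X server (coarse-grained)
xhost +local:

# Pass the host's DISPLAY and X socket into the container
docker run -it \
  -e DISPLAY="$DISPLAY" \
  -v /tmp/.X11-unix:/tmp/.X11-unix \
  my-chromium-image \
  chromium-browser --no-sandbox
```

Chromium usually also needs `--no-sandbox` when it runs as root inside a container; for a headless use case, `--headless` avoids the display requirement entirely.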
This one is really getting to me. I would like to run most of my Docker containers by connecting them to a VLAN on my Ubiquiti network, but I can't seem to get it working, although admittedly I don't know much about Docker and I'm still very new to it.

I have tried creating a macvlan network for one of the VLANs, and that seems to be alright, but I can't seem to get my containers connected to it.

I would like to connect to my VLAN; just doing that with an nginx container or something would be a great example for me to go by.

Thanks in advance for your help.
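As a minimal sketch to go by (the subnet, gateway, VLAN ID and parent interface are made-up values; they must match what the Ubiquiti gear actually serves on that VLAN):

```shell
# Create a macvlan network bound to VLAN 20 via the sub-interface eth0.20
docker network create -d macvlan \
  --subnet=192.168.20.0/24 \
  --gateway=192.168.20.1 \
  -o parent=eth0.20 vlan20

# Attach an nginx container with a fixed address on that VLAN
docker run -d --name web --network vlan20 --ip 192.168.20.50 nginx
```

One macvlan caveat worth knowing up front: the host itself cannot reach containers on a macvlan network directly, but other machines on the VLAN can.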

I have a question about Apache Airflow. We currently have an Airflow setup running in AWS with one core (master) node and 5 worker nodes, all running in Docker containers set up and managed by an ECS service and an EC2 auto-scaling group. Right now all the workers are m5.xlarge.

According to the developers, the reason they all have to be m5.xlarge is that there is one job with a dataset that would otherwise not fit in the memory of a single instance. But the majority of the jobs are small and don't need a lot of resources, so the 5 instances are basically idle most of the time.

I know little or nothing about Apache Airflow. My questions, specifically about this setup (Airflow in Docker on ECS), are:

1. Does Airflow by default support a "fleet" of different instance sizes, and can it then, based on the job type (or other identification), send specific jobs to a certain type of worker?

2. Can the worker nodes in a default Airflow setup be spot instances? In other words, when a worker dies, would Airflow pick the job up again and re-run it, or does it not track the state of a job?

3. Is Airflow aware of how many workers there are and which jobs are running where? Is there any way from within Airflow to see what jobs are running and how many resources they are using?

4. I see a lot of Airflow core nodes with a very flat CPU utilisation line, which seems strange to me and possibly indicates a process stuck in some kind of loop. What is the best way to …
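On question 1: with the CeleryExecutor, Airflow does not size-match workers itself, but it does route tasks through named queues, so one big worker can listen on a dedicated queue while the memory-hungry task declares `queue="highmem"` on its operator. A sketch (the queue names are made up):

```shell
# On the ordinary (smaller) workers: consume only the default queue
airflow worker -q default

# On the single m5.xlarge instance: consume only the high-memory queue
airflow worker -q highmem
```

On question 2, retry behaviour is configured per task (`retries`/`retry_delay` on the operator), so spot workers are workable as long as the tasks are idempotent and allowed to retry.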
Dear users,
I am in big trouble.

My configuration is: Windows 10 Pro; VirtualBox 6.0; inside VirtualBox, Linux Mint Cinnamon 19.2; and in Linux, Docker 19.03.5.

From my home connection everything works fine, including Docker; I can, for example, run a "search" without problems. At work I am behind a proxy, and the proxy is well configured (Internet access is available and the browser works fine). For the Docker proxy configuration I have done exactly what is described in the official page, but Docker doesn't work. If I try a sudo docker search, I see this message:

Error response from daemon: Get dial tcp i/o timeout

How can I fix this problem? It is very important, because without Docker I can't work on a fundamental part of the project. Thank you very much; if you need any additional info or logs, let me know and I will send them in a minute.

EDIT: this is another error response when I try to run something:

user@User-VirtualBox-Mint:/$ sudo docker run hello-world
[sudo] password for user:        
Unable to find image 'hello-world:latest' locally
docker: Error response from daemon: Get net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers).
See 'docker run --help'.
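For comparison, the daemon-side proxy configuration the official page describes is a systemd drop-in; `docker search`/`docker pull` go through the daemon, not the shell, so shell proxy variables alone are not enough. The proxy host and port below are placeholders:

```ini
# /etc/systemd/system/docker.service.d/http-proxy.conf
[Service]
Environment="HTTP_PROXY=http://proxy.example.com:3128"
Environment="HTTPS_PROXY=http://proxy.example.com:3128"
Environment="NO_PROXY=localhost,127.0.0.1"
```

After editing, run `sudo systemctl daemon-reload && sudo systemctl restart docker`, then check the values took effect with `sudo systemctl show --property=Environment docker`. If the proxy intercepts TLS, the proxy's CA certificate must also be trusted by the VM, or pulls will still time out.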
I have 3 independent systems and I want to integrate them, so that when a user registers in SystemX they are registered on the other systems, when they log in to SystemX they are logged in to the other systems, and so on. So I am searching for something like a system integrator (open source).
Any idea of such systems?
Note: I have read about Docker, Kubernetes, Vagrant, microservices, ESB, iPaaS, etc., but I am totally lost.
I am looking to switch to a DevOps/Cloud career from an Oracle admin career.

Kindly advise:
- how to define a learning path
- what the tools/software are
- a good online paid course
Hello, I have built a .NET solution, which you can see here on GitHub. The solution contains a Web API project which is based on MongoDB and uses EasyNetQ. It also contains a console application which has a reference to the Web API and also uses EasyNetQ.
I was asked to deliver this solution with a docker-compose command; however, I faced a few issues with the Dockerfiles and the docker-compose file, hence this question. Whenever I add the Visual Studio support for Docker to my projects, the solution fails to run, among other problems. Can you please provide me with the correct Docker files for this scenario?
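As a rough docker-compose sketch, assuming EasyNetQ talks to RabbitMQ and that the two projects have their own Dockerfiles (all service names, paths and ports here are assumptions, not taken from the repository):

```yaml
version: '3'
services:
  webapi:
    build: ./WebApi            # path to the Web API Dockerfile (placeholder)
    ports:
      - '5000:80'
    depends_on:
      - mongo
      - rabbitmq
  console:
    build: ./ConsoleApp        # path to the console app Dockerfile (placeholder)
    depends_on:
      - rabbitmq
  mongo:
    image: mongo:4
  rabbitmq:
    image: rabbitmq:3-management
```

The usual gotcha with the Visual Studio-generated files is connection strings: inside the compose network the applications must address `mongodb://mongo:27017` and RabbitMQ host `rabbitmq` (the service names), not `localhost`.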

I'm trying to install Docker on a Linux Red Hat RHEL 7.2 server and am facing a lot of issues, like missing libraries. I'm also having a lot of issues with SELinux.

What are the prerequisites for this? Which kernel? You can see the output below:

root@su14692# sudo yum install docker-ce docker-ce-cli
Loaded plugins: langpacks, product-id, search-disabled-repos, subscription-manager
This system is not registered to Red Hat Subscription Management. You can use subscription-manager to register.
Repository docker-ce-stable is listed more than once in the configuration
Repository docker-ce-stable-debuginfo is listed more than once in the configuration
Repository docker-ce-stable-source is listed more than once in the configuration
Repository docker-ce-edge is listed more than once in the configuration
Repository docker-ce-edge-debuginfo is listed more than once in the configuration
Repository docker-ce-edge-source is listed more than once in the configuration
Repository docker-ce-test is listed more than once in the configuration
Repository docker-ce-test-debuginfo is listed more than once in the configuration
Repository docker-ce-test-source is listed more than once in the configuration
Repository docker-ce-nightly is listed more than once in the configuration
Repository docker-ce-nightly-debuginfo is listed more than once in the configuration
Repository docker-ce-nightly-source is listed more than once in the configuration

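The repeated "listed more than once" warnings suggest the docker-ce repo file was added several times. A sketch of a clean Docker CE install on RHEL/CentOS 7 (Docker publishes the CentOS repo for this; a 3.10+ kernel, which RHEL 7.2 ships, is the main kernel prerequisite):

```shell
# Remove the duplicated repo definitions, then add the repo once
sudo rm -f /etc/yum.repos.d/docker-ce.repo*
sudo yum install -y yum-utils device-mapper-persistent-data lvm2
sudo yum-config-manager --add-repo https://download.docker.com/linux/centos/docker-ce.repo
sudo yum install -y docker-ce docker-ce-cli containerd.io
sudo systemctl enable --now docker
```

For the SELinux side, prefer labelling bind mounts with the `:z`/`:Z` volume options over setting SELinux to permissive.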

Hello, first of all excuse the newbie question: I'm totally new to Docker and to Attendize.
I've installed the open-source software Attendize in a Docker environment successfully, following these instructions:

and soon after the installation I was able to connect to localhost and test the application.

But after a restart of the PC I was not able to connect to localhost again: maybe I should "restart" something in Docker?

I have Ubuntu 16.04, Docker 18.09.6, Docker Compose 1.23.2.
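Containers do not come back after a reboot unless started again or given a restart policy. Assuming the instructions used docker-compose, a sketch (the directory path is a placeholder):

```shell
cd ~/attendize          # wherever the stack's docker-compose.yml lives
docker-compose up -d    # start the stack again in the background
docker-compose ps       # check everything is Up
```

Adding `restart: unless-stopped` to each service in docker-compose.yml makes the stack survive reboots automatically (the Docker service itself must also be enabled: `sudo systemctl enable docker`).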

How do I move data from SonarQube (7.4) with PostgreSQL on a standalone server to another Ubuntu server running SonarQube on Docker with PostgreSQL?
What I did:
I installed SonarQube 7.7-community and PostgreSQL 11.3 on Docker,
then I took a dump of the old SonarQube database and restored it on the new PostgreSQL.
It doesn't work.
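A dump/restore sketch (the container, user and database names are placeholders). The key point is that SonarQube 7.7 must be allowed to migrate the restored 7.4 schema itself; restoring the dump and starting 7.7 without running the migration will fail:

```shell
# On the old standalone server: dump the SonarQube 7.4 database
pg_dump -U sonar -Fc sonar > sonar74.dump

# On the new host: restore it into the dockerised PostgreSQL
docker cp sonar74.dump postgres:/tmp/
docker exec -it postgres pg_restore -U sonar -d sonar --clean /tmp/sonar74.dump

# Start SonarQube 7.7 against that database, then open
# http://<host>:9000/setup to run the schema upgrade
```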
My nginx container cannot see other containers.
I start nginx in Docker as follows:
I docker pull nginx, create my own image, and copy my nginx.conf to /etc/nginx/nginx.conf.
My docker-compose file for nginx is:
version: '3'
services:
  nginx:
    image: ssh-nginx:2.0
    container_name: my-nginx
    ports:
      - '80:80'
      - '443:443'
      - '8080'
    volumes:
      - /containers/nginx/src:/usr/share/nginx/html
      - /containers/nginx/src/site-enable:/var/www
I have Jenkins running on the same server, and build.conf is in the volume /var/www/.
build.conf is:

server {
   listen       80;
   server_name  "";

   access_log off;

   location / {
       proxy_pass         http://build:8080;

       proxy_set_header   Host             $host;
       proxy_set_header   X-Real-IP        $remote_addr;
       proxy_set_header   X-Forwarded-For  $proxy_add_x_forwarded_for;
       proxy_set_header   X-Forwarded-Proto http;
       proxy_max_temp_file_size 0;

       proxy_connect_timeout      150;
       proxy_send_timeout         100;
       proxy_read_timeout         100;

       proxy_buffer_size          8k;
       proxy_buffers              4 32k;
       proxy_busy_buffers_size    64k;
       proxy_temp_file_write_size 64k;
   }
}
index.html in nginx is OK and I can see my page from index.html.
The problem is that I cannot see the Jenkins build; the browser says "page not found"....
inotifywait -mrq /etc | grep …
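`proxy_pass http://build:8080;` only works if a container named `build` shares a user-defined Docker network with the nginx container; names are not resolvable across separate networks, nor to processes running directly on the host. A sketch with Jenkins in the same compose file (the Jenkins image name is assumed):

```yaml
version: '3'
services:
  nginx:
    image: ssh-nginx:2.0
    ports:
      - '80:80'
  build:
    image: jenkins/jenkins:lts   # so the hostname "build" resolves from nginx
```

If Jenkins already runs as a separate container, `docker network connect <nginx_network> <jenkins_container>` attaches it to nginx's network; if it runs directly on the host, proxy_pass must target the host's IP rather than a container name.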
I'm trying to run SonarQube 7.7 on Docker with a PostgreSQL database.
I'm running Artifactory on PostgreSQL, using the default port 5432.
When I try SonarQube and change the port to 5430 in my docker-compose file, the web server won't start.
How can I use the same Docker PostgreSQL for both Artifactory and Sonar?
I mean, how do I tell Sonar from the compose file to use the existing container (of course I log in to Postgres and create a database for Sonar)?
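One way is to put SonarQube on the external network that the Artifactory stack created, and point the JDBC URL at the existing PostgreSQL container by name (the network, container and credential names below are assumptions):

```yaml
version: '3'
services:
  sonarqube:
    image: sonarqube:7.7-community
    ports:
      - '9000:9000'
    environment:
      # "postgres" = the existing container's name on the shared network;
      # the sonar database and user must already exist in it
      - SONARQUBE_JDBC_URL=jdbc:postgresql://postgres:5432/sonar
      - SONARQUBE_JDBC_USERNAME=sonar
      - SONARQUBE_JDBC_PASSWORD=sonar
    networks:
      - artifactory_default
networks:
  artifactory_default:
    external: true               # the network the Artifactory stack created
```

Since both applications reach PostgreSQL over the internal Docker network, the host-side port (5432 vs 5430) doesn't matter; only one container needs to publish it, if at all.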
Dear Experts,
is it good practice to run MariaDB/MySQL in Docker in a production environment for 10,000 users?

Please advise!
Is there any way to create a master-slave setup for PostgreSQL when the slave is a Docker installation?
XCP-ng Center shows nothing under container management.
I have CoreOS and 2 containers running in this OS, but I can't see the containers under Container Management in XCP-ng Center (Xen).
I have the latest CoreOS and the latest XCP-ng server (7.6).
I have an nginx setup with many sites-enabled files; how do I configure this if I move it to nginx in Docker?
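The simplest pattern is to bind-mount the existing configuration into the stock nginx image rather than rebuilding it (the host paths here are illustrative):

```shell
docker run -d --name nginx \
  -p 80:80 -p 443:443 \
  -v /etc/nginx/nginx.conf:/etc/nginx/nginx.conf:ro \
  -v /etc/nginx/sites-enabled:/etc/nginx/sites-enabled:ro \
  -v /var/www:/var/www:ro \
  nginx
```

Two things usually need adjusting: nginx.conf must `include /etc/nginx/sites-enabled/*;`, and any upstream/proxy_pass hostnames must be resolvable from inside the container (container names on a shared Docker network, or the host's IP).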
Hi Experts,

I have encountered an issue during the deployment of a Docker image when heading over to the URL with the port.

Please note I am using an instance in AWS.
How do I write application event logs in an individual (not centralised) Docker container for Windows that has been deployed in Azure Service Fabric via an AWS CI/CD pipeline? What configuration is needed, if any? And if it is being logged, what will be the file path?

I am having a problem with Docker CE (Docker version 18.09.1, build 4c52b90)

on CentOS 7. The problem is with macvlan bridging: I am trying to configure a macvlan bridge with an excluded address, using the following code.

docker network create -d macvlan \
  --subnet= \
  --gateway= \
  --aux-address="router=" \
  -o parent=enp0s3 pub_net

I get the error: Error response from daemon: Pool overlaps with other one on this address space
or: no matching subnet for aux-address

When this is sorted out I would like to run this command:

docker run -p 8000:443/tcp  --rm -itd   --network pub_net   --name my_pub_net-alpine   alpine:latest   ash

Does anyone know how to fix this problem, or can you give me a workaround so I can forward ports for nginx?

Kind regards
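"Pool overlaps with other one on this address space" means another Docker network already claims that subnet, and "no matching subnet for aux-address" means the `router=` address lies outside the `--subnet` range. A sketch with made-up addresses:

```shell
# Find and remove the network that already owns the subnet
docker network ls
docker network inspect old_pub_net   # placeholder name for the overlapping network
docker network rm old_pub_net

# The aux-address must fall inside --subnet
docker network create -d macvlan \
  --subnet=192.168.1.0/24 \
  --gateway=192.168.1.1 \
  --aux-address="router=192.168.1.1" \
  -o parent=enp0s3 pub_net
```

Also note that `-p 8000:443` has no effect on a macvlan network: the container gets its own address on the LAN, so clients connect to the container's IP and port directly rather than through a published host port.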

Can anyone provide me with a link on how to uninstall Docker Compose from Windows Server 2016?

There is plenty out there about how to uninstall Docker from Windows Server 2016 and RHEL, but nothing I can find that is specifically about Docker Compose on Windows Server 2016.
I managed to run Docker and install WordPress on Ubuntu Linux, but I can't seem to get the hang of editing the files within the container, as I get permission issues.
I think I am looking at it the wrong way.
Could someone get me thinking the right way? Because I love the performance for local development :)
(PHP, WordPress, MySQL on nginx).
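The permission issue is usually that the bind-mounted WordPress files are owned by the container's web-server user (www-data, UID 33 in the official images), not by your desktop user. Two common sketches (the container name and paths are placeholders):

```shell
# Option 1: edit inside the container as the same user PHP runs as
docker exec -it -u www-data wordpress bash

# Option 2: give your host user's group write access to the mounted files
sudo chown -R 33:"$(id -gn)" ./wp-content   # 33 = www-data in the image
sudo chmod -R g+w ./wp-content
```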

Why can't I run the Docker container? It fails with an error. Your advice please, thanks.


Can anyone advise how to retrieve accurate information to determine the overhead and calculate the hardware requirements (CPU, RAM, storage, network) for Docker container apps, with elastic load balancing and fault tolerance, running on a Kubernetes orchestration layer, to design for 500,000 live video feeds sharing the load on a bare-metal design?

I'm just starting to learn Docker, and I think I have some of the basic concepts of creating containers down.  My intention is to have multiple containers on my server, each serving one unique website.

Now, here's my question.  I don't know how to handle the ports if there are multiple containers all set to respond to port 80.  Won't it cause some sort of problem if there are multiple containers, each running their own instance of apache, each reacting to port 80?  Is there some sort of internal IP addressing then that needs to take place to handle that?

I've got a pretty decent idea how the routing/responding through Apache works on a single server - but isn't this conceptually multiple servers all tied together with the same IP?
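Exactly: each container gets its own internal IP on Docker's bridge network, so every container can listen on its own port 80 internally; only one process can bind the host's port 80. The two usual patterns, shown as a sketch (image names and hostnames below are placeholders):

```shell
# Option 1: map each site's internal port 80 to a different host port
docker run -d --name site-a -p 8081:80 my-apache-site-a
docker run -d --name site-b -p 8082:80 my-apache-site-b

# Option 2: one reverse proxy owns host port 80 and routes by Host header;
# jwilder/nginx-proxy configures itself from each container's VIRTUAL_HOST
docker run -d -p 80:80 \
  -v /var/run/docker.sock:/tmp/docker.sock:ro jwilder/nginx-proxy
docker run -d -e VIRTUAL_HOST=site-a.example.com my-apache-site-a
docker run -d -e VIRTUAL_HOST=site-b.example.com my-apache-site-b
```

Option 2 is the closest analogue to Apache name-based virtual hosts: one public IP and port, many backends chosen by the requested hostname.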





