Application Development, Testing, Production

I've searched but haven't found anything, so I thought I'd ask. In very broad terms, I'm looking for information about setting up a test environment for my developers. They currently have all of the development tools on their own computers and do some testing from there.

We recently set up a test web server (we're 100% web apps), but they have been testing against the production database server.

We have plans for production web and DB servers, as well as test web and DB servers. I'm really looking for the most common process for managing such an environment. So here is what I'm thinking:

1. Developers build and modify apps on their own computers until they feel the application is ready for users to test.

2. Developers put the application into test (aka pre-production) and let users test it, working out any bugs along the way on their development computers and pushing changes back to test as needed.

3. Once everything tests fine, move it to production (a rough sketch of what that could look like is below).
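As a sketch of step 3 only (the server paths and app name here are hypothetical, since the exact stack isn't specified), promotion can be as simple as keeping a timestamped copy of the current production files and then copying the tested build over from the test web root:

```python
import shutil
from datetime import datetime
from pathlib import Path

# Hypothetical locations; substitute the real test and production web roots.
TEST_ROOT = Path(r"\\testweb01\webapps\orderportal")
PROD_ROOT = Path(r"\\prodweb01\webapps\orderportal")
BACKUP_DIR = Path(r"\\prodweb01\webapps\_backups")

def promote() -> None:
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    if PROD_ROOT.exists():
        # Keep the current production copy so a bad release can be rolled back.
        shutil.copytree(PROD_ROOT, BACKUP_DIR / f"orderportal_{stamp}")
        shutil.rmtree(PROD_ROOT)
    # Copy the tested build from test into production.
    shutil.copytree(TEST_ROOT, PROD_ROOT)

if __name__ == "__main__":
    promote()
```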

I'm sure that's the normal process; now for the major questions about it. How often do I move data back the other way? Our lead developer doesn't care about the data (which sounds a little odd to me), but they've expressed interest in periodically mirroring production back into test.

We're 100% virtual, so I'm thinking of just deleting the test machines and cloning the production machines; I'll worry about duplicate computer names and IPs another time. That doesn't sound like the most effective approach to me, though. I was thinking of only moving the data from production to test, but they've said they don't care about the data...
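For comparison, if only the database were refreshed rather than cloning whole machines, a minimal sketch could look like the following (assuming SQL Server with Windows authentication and the sqlcmd tool on the path; the server names, database name, and backup share are all made up for illustration):

```python
import subprocess

# Made-up names; replace with the real servers, database, and backup share.
PROD_SERVER = "proddb01"
TEST_SERVER = "testdb01"
DATABASE = "AppDb"
BACKUP_FILE = r"\\fileserver\dbrefresh\AppDb.bak"

def run_sql(server: str, sql: str) -> None:
    # -E = Windows authentication, -b = abort on the first error.
    subprocess.run(["sqlcmd", "-S", server, "-E", "-b", "-Q", sql], check=True)

def refresh_test_from_prod() -> None:
    # Copy-only backup so the regular production backup chain is untouched.
    run_sql(PROD_SERVER,
            f"BACKUP DATABASE [{DATABASE}] TO DISK = N'{BACKUP_FILE}' WITH COPY_ONLY, INIT")
    # Overwrite the test copy with the fresh production backup.
    run_sql(TEST_SERVER,
            f"RESTORE DATABASE [{DATABASE}] FROM DISK = N'{BACKUP_FILE}' WITH REPLACE")

if __name__ == "__main__":
    refresh_test_from_prod()
```

Anything environment-specific on the test copy (logins, linked servers, connection strings) would still need to be reset after the restore.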

Thoughts?

PS- I have a meeting in 45 minutes to begin discussions about test labs... Any help before that would be most helpful... :)
tpitch-ssemc asked:
 
angus_young_acdc commented:
A bit late here too, but I'll throw in a little of my opinion.

Personally (and I know several people who prefer this), I think developers should have a specific environment set up for them to use. This could be a machine they all remote into, or virtual machines that you can clone.

The reason behind this is to keep everything the same! A common issue I've found is that a developer will load something into test, and when it doesn't work they'll say "but it works on my machine"... which is fine, and true, but it's hard to work out why it works on their machine when there are potentially endless variations of software types and versions.

If all the developers are using machines with exactly the same software and specifications in a development environment, it should limit the chances of that happening.
 
hlaprade commented:
Sorry tpitch-ssemc, I think I'm late, but I can tell you that developers in general don't care about fresh data. It seems odd, but it's more a matter of not understanding the entire company's data model.
Personally, what I do is log shipping, and it works like a charm for me. I know my developers hate me for it, but they keep the database objects under version control in script files; that way you can see changes very quickly and keep the database versioned.
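As a rough sketch of that script-file approach (assuming SQL Server and the pyodbc driver; the server and database names are hypothetical), you can dump every procedure, view, and function definition to one .sql file per object and commit the folder to source control:

```python
import pyodbc
from pathlib import Path

# Hypothetical connection details; point this at your test database.
CONN_STR = ("DRIVER={ODBC Driver 17 for SQL Server};"
            "SERVER=testdb01;DATABASE=AppDb;Trusted_Connection=yes")
OUT_DIR = Path("db_scripts")   # check this folder into source control

def dump_modules() -> None:
    OUT_DIR.mkdir(exist_ok=True)
    with pyodbc.connect(CONN_STR) as conn:
        rows = conn.execute("""
            SELECT s.name, o.name, o.type_desc, m.definition
            FROM sys.sql_modules AS m
            JOIN sys.objects     AS o ON o.object_id = m.object_id
            JOIN sys.schemas     AS s ON s.schema_id = o.schema_id
        """).fetchall()
    for schema, name, type_desc, definition in rows:
        # One file per object, e.g. dbo.GetOrders.SQL_STORED_PROCEDURE.sql
        path = OUT_DIR / f"{schema}.{name}.{type_desc}.sql"
        path.write_text(definition, encoding="utf-8")

if __name__ == "__main__":
    dump_modules()
```

Run something like this after each change (or on a schedule) and the diffs in source control show exactly what changed in the database.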

Luke
 
tpitch-ssemc (author) commented:
Thanks. I think I know how to proceed; I just need to find the right procedures to implement.

Thanks again!