Solved

Way to make a program faster by using more processor

Posted on 2006-06-02
Medium Priority
588 Views
Last Modified: 2010-04-07
I have a vb app that under load uses about 8% processor and never goes over 11MB of memory usage.  I would like the program to use more resources so that
it completes it's tasks faster.  What can be done to get more performance out of it?
Question by:John Gates, CISSP
37 Comments
 
LVL 29

Expert Comment

by:nffvrxqgrcfqvvc
ID: 16816296
If you mean it's already compiled, then not much: you would have to open Task Manager, right-click the process, and choose Set Priority / High.

If you want to do this in code, then use the API:
http://msdn.microsoft.com/library/en-us/dllproc/base/setthreadpriority.asp
 
LVL 26

Expert Comment

by:EDDYKT
ID: 16816638
Don't accept this comment as answer.

To do it in VB:

Const THREAD_BASE_PRIORITY_IDLE = -15
Const THREAD_BASE_PRIORITY_LOWRT = 15
Const THREAD_BASE_PRIORITY_MIN = -2
Const THREAD_BASE_PRIORITY_MAX = 2
Const THREAD_PRIORITY_LOWEST = THREAD_BASE_PRIORITY_MIN
Const THREAD_PRIORITY_HIGHEST = THREAD_BASE_PRIORITY_MAX
Const THREAD_PRIORITY_BELOW_NORMAL = (THREAD_PRIORITY_LOWEST + 1)
Const THREAD_PRIORITY_ABOVE_NORMAL = (THREAD_PRIORITY_HIGHEST - 1)
Const THREAD_PRIORITY_IDLE = THREAD_BASE_PRIORITY_IDLE
Const THREAD_PRIORITY_NORMAL = 0
Const THREAD_PRIORITY_TIME_CRITICAL = THREAD_BASE_PRIORITY_LOWRT
Const HIGH_PRIORITY_CLASS = &H80
Const IDLE_PRIORITY_CLASS = &H40
Const NORMAL_PRIORITY_CLASS = &H20
Const REALTIME_PRIORITY_CLASS = &H100
Private Declare Function SetThreadPriority Lib "kernel32" (ByVal hThread As Long, ByVal nPriority As Long) As Long
Private Declare Function SetPriorityClass Lib "kernel32" (ByVal hProcess As Long, ByVal dwPriorityClass As Long) As Long
Private Declare Function GetThreadPriority Lib "kernel32" (ByVal hThread As Long) As Long
Private Declare Function GetPriorityClass Lib "kernel32" (ByVal hProcess As Long) As Long
Private Declare Function GetCurrentThread Lib "kernel32" () As Long
Private Declare Function GetCurrentProcess Lib "kernel32" () As Long
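' NOTE: this example LOWERS the priority; to favor your app and use more CPU
' instead, pass THREAD_PRIORITY_HIGHEST and HIGH_PRIORITY_CLASS below.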
Private Sub Form_Load()
    'KPD-Team 2000
    'URL: http://www.allapi.net/
    'E-Mail: KPDTeam@Allapi.net
    Dim hThread As Long, hProcess As Long
    'retrieve the current thread and process
    hThread = GetCurrentThread
    hProcess = GetCurrentProcess
    'set the new thread priority to "lowest"
    SetThreadPriority hThread, THREAD_PRIORITY_LOWEST
    'set the new priority class to "idle"
    SetPriorityClass hProcess, IDLE_PRIORITY_CLASS
    'print some results
    Me.AutoRedraw = True
    Me.Print "Current Thread Priority:" + Str$(GetThreadPriority(hThread))
    Me.Print "Current Priority Class:" + Str$(GetPriorityClass(hProcess))
End Sub
 
LVL 4

Expert Comment

by:NicoLaan
ID: 16816759
I think it highly depends on your task.
Obvious example: if the task is to copy files from A to B, speed depends on the hard disk; if A or B is on the network, it (also) depends on the network card and network traffic.

Can you tell us what your program does?

 
LVL 4

Expert Comment

by:NicoLaan
ID: 16816794
If the problem is that other processes eat too much CPU time and you want your VB app to be the boss, the first two comments are at least on the right track (I think).
Otherwise the problem becomes more complex and might not be solved in code, but perhaps with other hardware, or by completely rewriting the code to make better use of the available resources.
 
LVL 18

Author Comment

by:John Gates, CISSP
ID: 16816833
The program does file access, compares entries to a SQL database, and posts results.
 
LVL 4

Expert Comment

by:NicoLaan
ID: 16817047
Had I written the app myself, I might have some good ideas to improve its speed, but from here it's guesswork, so I'm just shooting out some ideas.

Basically, you need to find the bottlenecks and improve their speed.

Possible bottlenecks and simple suggestions to improve them:
slow / bad queries --> improve them (can be an art in itself)
too much work on the client and too little on the SQL server, causing too much network traffic --> change queries to run / calculate more on the server
heavily used database server --> change to a new database server, add memory to the server, add CPU
heavy network load --> change network speed (i.e. to Gigabit) or run the job when no one else is using the network, such as in the evening
hard disk bottleneck --> change to a new or extra hard disk, maybe SCSI, maybe a 10k RPM disk

But I think YOU need to get some idea where the problem lies, unless I seriously look into the actual code, which I don't think is a great idea (time and such :-).
Where do YOU think the problem lies?
Any clues?
A tool to get some more ideas about where the problem lies is Performance Monitor, under Administrative Tools in WinNT, 2000, XP and so on.
I've hardly ever used it, so I can't give many tips in that area, but it might give you some ideas.

Some other tests to do to get an idea:
- Test it on a machine with a better, faster hard disk.
- Test whether the problem is SQL related by not actually making a query, but just creating a dummy result set on your computer.
- Test the performance when the network is not being used by others (evening).
If any of these tests gives higher CPU usage than 8%, you're on the right track.
 
LVL 46

Expert Comment

by:aikimark
ID: 16842601
Very good comments NicoLaan.

==========================
@dimante

<<I would like the program to use more resources so that
it completes it's tasks faster.  What can be done to get more performance out of it?>>
You are I/O bound.  Using more CPU will require you to multi-thread your processes in order to perform the database comparison in parallel.  Your priority probably makes little difference, unless it is low compared to some other CPU hog.

Streamlining the process really depends on your algorithm.  For instance, if you are processing multiple files with a single process, you will need to have very efficient file I/O and (potentially) combine several database calls into one.

Your eventual solution may depend on how you measure "more performance".  
Is it
* Number of files processed per unit of time
* Elapsed time to process all files
* Elapsed time to process each file (usual stats) normalized to the size of the file or the number of validations in the file.
* Elapsed time to validate per unit of time (usual stats).
* User perception of performance.
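
For illustration, a minimal sketch (assuming VB classic) of capturing one of these metrics -- elapsed time per file -- with the Timer function; oFolder, oFile, and ProcessFile are placeholders for your own loop and per-file work:

Dim sngStart As Single, sngTotal As Single
Dim lngFiles As Long

sngStart = Timer
For Each oFile In oFolder.Files
    ProcessFile oFile.Path          'your per-file work goes here
    lngFiles = lngFiles + 1
Next
sngTotal = Timer - sngStart         'elapsed seconds (Timer wraps at midnight)

Debug.Print "Files processed:" & Str$(lngFiles)
Debug.Print "Total elapsed: " & Format$(sngTotal, "0.00") & " sec"
Debug.Print "Per file: " & Format$(sngTotal / lngFiles, "0.0000") & " sec"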
 
LVL 4

Expert Comment

by:NicoLaan
ID: 16847005
Thank you, aikimark.
Hadn't thought about threading, but that's also a great way to get things done faster.
 
LVL 18

Author Comment

by:John Gates, CISSP
ID: 16847471
Threading sounds good, but do you have an example of its implementation?

 
LVL 46

Expert Comment

by:aikimark
ID: 16847715
@dimante

Multi-threading isn't a trivial task.  Since we don't have very much information about your application's details, I'm afraid we might give you advice which would be valid, but would lead you into a terrible mess when you tried to implement it.

To understand multi-threading, think about starting 2, or more, copies of your application with Start commands in a .CMD file.  Each Start command would supply a different command-line parameter to your application (or start in a different directory with different control files).  Each copy of the application program would process a different subset of the files in parallel.  Thus, the elapsed time to process the files would theoretically be 1/N the time you experience now, where N is the number of simultaneous invocations.  

However, this isn't a complete solution, as your programs may hit a limit on the number of database connections, or some other bottleneck already mentioned by NicoLaan.

A more complicated configuration requires you spawn multiple threads from a single program and coordinate their activities...not an undertaking for a beginner.
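
For illustration only, a sketch of that multi-copy idea launched from VB with Shell rather than a .CMD file; Worker.exe and its file-list arguments are assumptions, not part of your project:

Dim i As Long, N As Long
N = 4                               'number of parallel copies
For i = 1 To N
    'each copy gets its own subset of the files via a command-line argument
    Shell App.Path & "\Worker.exe filelist" & i & ".txt", vbMinimizedNoFocus
Next
'inside Worker.exe, Command$ returns its argument ("filelist1.txt", etc.)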
 
LVL 18

Author Comment

by:John Gates, CISSP
ID: 16848177
That does not sound like what I want to do...  Let me explain what the program does; maybe that will spark something here to help.  This program reads the file names from a network directory into an oDic dictionary.  It then reads a compare text file and reports any files missing or unknown.  The second part of this application reads the contents of each file from step 1 line by line and looks for errors in the files contents.  If it finds an error it goes to an sql database to get additional parameters and reports it to the user interface.  When there are 2,000 files you can see where this can take a while to process.  Hope this helps some ideas flow.  Multithreading as it has been explained so far does not sound practical in this case...  What would keep track of what was where?
 
LVL 46

Expert Comment

by:aikimark
ID: 16848556
1. Line-at-a-time processing is a potential performance bottleneck.  Consider changing your I/O to read the entire contents of the open file into a (big) string variable.  You haven't told us how big these files can get, what format they take, or what processing your program does to determine whether an error occurs (other than a database call), so this might not be a simple performance tweak.

2. Assuming you have some decent VB programming skills and this is a VB-classic environment, here is how you could break up the work.
  a. Repackage your file-processing code into a form-less program (Sub Main entry point)
  b. Change the processing code (a) to read the (pathed) file names from a file in the App.Path directory and output an errors file.
  c. Add some logging output to a progress file to record what file you are on and any errors you encounter.  This file might have to be opened and closed at least once for every file processed to give the UI program an up-to-date view of the progress.
  d. (optional) Add a bit of code to (a) to create an empty flag file (Open for Output and immediately Close) to indicate completion.  Alternatively, you may indicate completion through a line in the log file (c).
  e. While processing the file, execute a DoEvents statement every so often.  You can use a line counter and a MOD operator if using the Line Input I/O statement.

  f. Change the UI program to create N different output files from the oDIC contents.  You decide how large/small N is.  It will probably be easiest to work with multiple directories below the UI application directory.
  g. Shell the N different (a) programs and start a timer, whose Timer event will track the progress of the N different (a) programs and update the UI with the progress.
  h. When the (a) programs have completed, process the N errors files and display their contents to the user.

==================================
The goodness of item 1 is that it directly addresses your I/O-bound performance bottleneck.  The goodness of item 2 is that it distributes the work.  Alternatively, you might decide to create the errors files in the (a) processing programs without making any database calls and then make the database calls in the UI program.
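
To make items (c) and (e) concrete, a rough sketch of the progress log plus the Mod/DoEvents pattern; the file names and the 500-line interval are arbitrary placeholders:

Dim lngLine As Long, sLine As String

Open "G:\somedir\datafile.txt" For Input As #1
Do Until EOF(1)
    Line Input #1, sLine
    lngLine = lngLine + 1
    '... error-checking on sLine goes here ...
    If lngLine Mod 500 = 0 Then
        DoEvents                    'keep the UI responsive
        Open App.Path & "\progress.log" For Append As #2
        Print #2, Now; " line"; lngLine   'timestamped progress trail
        Close #2
    End If
Loop
Close #1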
 
LVL 4

Assisted Solution

by:NicoLaan
NicoLaan earned 800 total points
ID: 16850186
aikimark,

Why the extra log files? (2c, 2d, 2f) Because you dropped the form (2a)? Is a form not quicker than a log file?
I'm not sure splitting the process would make it faster in this case...


Here's what I'd try:
It seems easy to split your program into separate steps.
I'd first test the performance of each step and find the bottleneck that way; best bets are (4a) and (5).

>> This program reads the file names from a network directory into an oDic dictionary.
1) Read file names (dir command) into oDic (oDic is just the in memory storage for the results right?)
>>It then reads a compare text file and reports any files missing or unknown.
2) Read compare text file (local, remote?)
3) Report (how are things reported? to screen or file? if file, local, remote?)
>>The second part of this application reads the contents of each file from step 1 line by line and looks for errors in the files contents.  
4a) Read files (if you can, read each entire file at once; this might increase performance a lot [depends on VB6 file caching abilities or lack thereof], then read line by line from the string in memory)
4b) and find errors (don't do anything with them yet; maybe store them in another string or memory object)
>>If it finds an error it goes to an sql database to get additional parameters and reports it to the user interface.  
5) Get SQL info (do you now make a query for every error you find? if so, can you get all parameters into memory at once and get the parameters for each error from this memory object or is that too big?)
6) Report

To test whether step 4 can be improved:
a) Copy the files to your local hard disk using Windows and see how fast / slow that is.
b) See what happens if the program operates on these local files instead of over the network.

Report test results please.
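
One quick way to run those timings: bracket each step with the Timer function.  Step1_ReadFileNames and Step2_ReadCompareFile are placeholder subs standing in for steps 1-6:

Dim t As Single
t = Timer: Step1_ReadFileNames: Debug.Print "Step 1:"; Timer - t; "sec"
t = Timer: Step2_ReadCompareFile: Debug.Print "Step 2:"; Timer - t; "sec"
'...and so on through step 6; the largest number marks your bottleneck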
 
LVL 46

Expert Comment

by:aikimark
ID: 16851573
@NicoLaan

The major reason for a log file:
to produce a trail of progress that a separate task can monitor.  This could be implemented in any number of other ways (atoms, named pipes, DDE links, windows messages, IP messages, etc.).  Outputting a log file is the simplest method to implement.  If the log file lines are timestamped (highly recommended), it would help identify performance bottlenecks.

I'm not sure where you drew the conclusion that there are multiple log files; there is only one in the simple scenario.  As to the question of speed and efficiency, a rapidly updating form can be slower than disk I/O.

In the alternative-solution paragraph, I suggested that an additional errors file could be created by the N processes, but this is not a log file, just an extract (abstract) of the processed files' errors.  This is your 4b step.

If the processed files are on a file server, then there are other components that can affect performance (switch and router speeds, speed of the slowest segment, distance to the file server, competing network traffic).  Transferring the files to the local hard drive is a good idea.

========================
Who knows...maybe there is a lot of string concatenation when the application constructs the error text or the SQL string.  This would also pose a performance bottleneck.
 
LVL 18

Author Comment

by:John Gates, CISSP
ID: 16863188
Ok the SQL is not where my issue lies.

I am reading 1000+ file names from a directory for a checking process.  I just need the names to compare and nothing more.  Here is the current code I use to get the names:

Set oFolder = ofso.GetFolder("G:\somedir")

For Each oFile In oFolder.Files
    <code to write filename to oDic>
Next

Is there a better way to get the filenames that would not involve the FileSystemObject?

This file access is what causes the bottleneck in the checking portion of the program.  Any suggestions would be appreciated!
 
LVL 29

Expert Comment

by:nffvrxqgrcfqvvc
ID: 16863278
You have two options available:

1) Use the FindFirstFile and FindNextFile APIs (use the Unicode versions).
2) Use dbghelp.dll and its EnumDirTree export (this is very fast).

You must download the redist version, which you can get at my site.  Then extract dbghelp.dll to your application's directory.
Note: Do not overwrite the dbghelp.dll version located in the system32 directory; make sure you save it to the application's directory.
http://www.geocities.com/egl1044/BU/dbghelp_6.6.3.5.zip

Then you can use my example.  Make sure you compile your project as an EXE and save your EXE to the same path as dbghelp.dll.
http://www.geocities.com/egl1044/SOURCE/enumdirtree.html
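
For option 1, a minimal sketch of the FindFirstFile/FindNextFile loop (ANSI versions shown for brevity; the path is a placeholder and the comment marks where the oDic add would go):

Private Type FILETIME
    dwLowDateTime As Long
    dwHighDateTime As Long
End Type

Private Type WIN32_FIND_DATA
    dwFileAttributes As Long
    ftCreationTime As FILETIME
    ftLastAccessTime As FILETIME
    ftLastWriteTime As FILETIME
    nFileSizeHigh As Long
    nFileSizeLow As Long
    dwReserved0 As Long
    dwReserved1 As Long
    cFileName As String * 260
    cAlternate As String * 14
End Type

Private Declare Function FindFirstFile Lib "kernel32" Alias "FindFirstFileA" (ByVal lpFileName As String, lpFindFileData As WIN32_FIND_DATA) As Long
Private Declare Function FindNextFile Lib "kernel32" Alias "FindNextFileA" (ByVal hFindFile As Long, lpFindFileData As WIN32_FIND_DATA) As Long
Private Declare Function FindClose Lib "kernel32" (ByVal hFindFile As Long) As Long

Private Const FILE_ATTRIBUTE_DIRECTORY = &H10
Private Const INVALID_HANDLE_VALUE = -1

Private Sub ListFiles()
    Dim wfd As WIN32_FIND_DATA, hFind As Long, sName As String
    hFind = FindFirstFile("G:\somedir\*.*", wfd)
    If hFind <> INVALID_HANDLE_VALUE Then
        Do
            'cFileName is a fixed-length buffer; trim at the null terminator
            sName = Left$(wfd.cFileName, InStr(wfd.cFileName, vbNullChar) - 1)
            If (wfd.dwFileAttributes And FILE_ATTRIBUTE_DIRECTORY) = 0 Then
                'add sName to oDic here
            End If
        Loop While FindNextFile(hFind, wfd) <> 0
        FindClose hFind
    End If
End Sub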
 
LVL 46

Accepted Solution

by:aikimark
aikimark earned 1200 total points
ID: 16863632
<<Ok the SQL is not where my issue lies.>>
How do you know that?  Did you follow the timing suggestions?  What did they reveal?

==============
<<I am reading 1000+ file names from a directory for a checking process.  I just need the names to compare and nothing more.  Here is the current code I use to get the names:>>
A) What does the code to write the filename to oDIC look like?

B) Is the oDIC object a Dictionary object?  What is the reference?  What is the version of the supporting DLL file?

C) Have you tried using a Collection object rather than a Dictionary object?

D) Do you update the UI while doing this?  If so, DON'T -- or at least only update every 50-100 files.

E) Is there a performance problem here?  If not, don't change it.

==============
<<Is there a better way to get the filenames that would not involve the FileSystemObject?>>
Try the old school way:

Dim strFilename As String
strFilename = Dir(("G:\somedir\*.*")
Do Until Len(strFilename) = 0
  'your code goes here to add strFilename or "G:\somedir\" & strFilename
  'to a list.
  strFilename = Dir
Loop

NOTE: In general, the FileSystemObject introduces object overhead.

=============
<<This file access is what causes the bottleneck in the checking portion of the program.  Any suggestions would be appreciated!>>

F) We've already said that you should benefit greatly by reading the entire contents of each file into a string variable with the INPUT() function.  This would limit the maximum individual file size (times two) to be less than the available RAM.  The reason for the two multiplier is the internal storage of strings in UNICODE format that uses two bytes for every character.

G) You can read the entire file into a byte array, but then you wouldn't be able to use the InStr() function to look for strings.

==============
H) You will have to think outside of the box to improve this performance.  Consider the following:
H.1 - third party software like DTSearch
H.2 - use the Windows Explorer/Shell calls to perform the search
H.3 - enable indexing services on the target directory to speed the searches
H.4 - preprocess the target directory while the user is starting the application or on some regular intervals with a different process that monitors directory change activities.
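
To illustrate item F together with the earlier read-it-all-at-once suggestion, a sketch using the usual Binary/Get idiom and then scanning the lines in memory; the path and the "ERROR" search string are assumptions:

Dim iFile As Integer, sData As String
Dim vLines As Variant, i As Long

iFile = FreeFile
Open "G:\somedir\somefile.txt" For Binary Access Read As #iFile
sData = Space$(LOF(iFile))          'buffer sized to the whole file
Get #iFile, , sData                 'one disk read instead of thousands
Close #iFile

vLines = Split(sData, vbCrLf)       'line-by-line scan, now all in memory
For i = 0 To UBound(vLines)
    If InStr(vLines(i), "ERROR") > 0 Then
        'handle the error line here
    End If
Next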
 
LVL 18

Author Comment

by:John Gates, CISSP
ID: 16864345
Hrm... What references or components do you need for Dir?  The Dir in your code is not recognized....
 
LVL 46

Expert Comment

by:aikimark
ID: 16864667
oops...typo -- extra left paren.  

should be:
  strFilename = Dir("G:\somedir\*.*")

Dir is part of the VB language.  There are no references or components needed.
 
LVL 46

Expert Comment

by:aikimark
ID: 16864715
Thanks for the points.  Glad we could help.

How much speed improvement have you gained using our suggestions?  Which changes gave you the most bang-for-the-buck?
 
LVL 18

Author Comment

by:John Gates, CISSP
ID: 16864848
Let me finish my code changes and I will post what helped the most and the speed savings 8)
 
LVL 46

Expert Comment

by:aikimark
ID: 16864952
@dimante

For future reference, it would be best if you didn't close your questions before making sure the comments actually helped you.  If you have problems or questions about implementation, you will need to open a new EE question.  If you do, please include a URL link back to this question so the experts will have some historical and context reference.
 
LVL 4

Expert Comment

by:NicoLaan
ID: 16865114
Gone for a day and so are the points!
Well, glad the problem is solved, though I'm still interested in the timings before and after, and in what got the most results.
 
LVL 4

Expert Comment

by:NicoLaan
ID: 16865120
And thanks for the assist points by the way! (just noticed)
 
LVL 18

Author Comment

by:John Gates, CISSP
ID: 16885421
Hmm... I just noticed something very evil.  The Dir function is missing the first filename in the directory.  Does anyone have any idea why that would be?  I cannot be missing any files.  I proved this by removing the file in question; then the next file in line was missing.

 
LVL 46

Expert Comment

by:aikimark
ID: 16885783
1. What does your code look like?  (please compare it with my posted Dir example before responding)

2. Do these directory files have any special properties, such as Hidden or System?

3. As far as I know, there are no outstanding service packs that affect the behavior of DIR
 
LVL 18

Author Comment

by:John Gates, CISSP
ID: 16885834
Yes, I needed to add the following; then the file was no longer skipped!

strfilename = Dir$("G:\mydir\", vbDirectory)

Without the vbDirectory directive it was skipping the first filename every time.

The default is vbNormal the way you have it listed above.  Well, I just learned something new today.  Thanks for responding back.  This Dir change has improved the speed of the program by 1000%: what took 5 minutes to run now runs in less than 30 seconds.  I am working on the second part of the program now.
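
One caution with that flag, sketched below under the same G:\mydir assumption: vbDirectory makes Dir return ".", "..", and any subfolder names as well, so a guard is needed if the directory can contain subfolders:

strfilename = Dir$("G:\mydir\", vbDirectory)
Do Until Len(strfilename) = 0
    If strfilename <> "." And strfilename <> ".." Then
        If (GetAttr("G:\mydir\" & strfilename) And vbDirectory) = 0 Then
            'a real file: add strfilename to oDic here
        End If
    End If
    strfilename = Dir$
Loop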
 
LVL 46

Expert Comment

by:aikimark
ID: 16890640
I'd be a bit more reserved about that performance-improvement figure.  You've reduced the run time by 90%.  I understand how you arrived at the 1000% figure, but it can be misleading.  I might use a 1000% figure when a metric increases, but not usually when a figure decreases.

===================
@NicoLaan, feel free to opine on performance metrics as dimante's process improves.
 
LVL 4

Expert Comment

by:NicoLaan
ID: 16897931
It takes the old program 1000% of the time of the new version. It takes 900% longer.
The new version takes 90% less time, as you said, aikimark.
If a 5-minute program decreased in time by 1000%, it would now take -45 minutes. (note the minus)
So you'd get the results three quarters of an hour before you start the program.

However, upon reading dimante's words again, he said the speed improved by 1000%, so that is also correct.
Say the old one processed 1000 files in 5 minutes, that is 1 per 0.5 minutes, now it is 1000 files in 0.5 minutes.
We all got it right; ain't primary school arithmetic great!

Had to correct my wording and figures a few times, hope I got it right now.
 
LVL 46

Expert Comment

by:aikimark
ID: 16898073
@NicoLaan

<<Say the old one processed 1000 files in 5 minutes, that is 1 per 0.5 minutes, now it is 1000 files in 0.5 minutes.>>
By my calculations 1000/5 = 1 file per 0.005 minutes (old).  Now it is 1 file per 0.0005 minutes.

<<We all got it right, aint primary school calculus great!>>
It's the quagmire of statistics.  Remember the title of, quite possibly the most famous, book on stat misuse: "How to Lie With Statistics"

I realized before my prior post that 1000% wasn't entirely incorrect.  However, as a long-time Perf & Tuning specialist, I know that these performance figures can be (correctly) interpreted in the myriad ways we've described.  While dimante implements performance tweaks, we can discuss more general P&T topics and results.  This is a teaching moment for both of us.
 
LVL 4

Expert Comment

by:NicoLaan
ID: 16906896
@aikimark

>>By my calculations 1000/5 = 1 file per 0.005 minutes (old).  Now it is 1 file per 0.0005 minutes.
You're right, of course. I was already afraid I had messed up somewhere.
I'm not really a P&T expert, but I do like to consider performance; I have some assembly background from the DOS age and enjoy tweaking Windows for the best performance. But for most of the programming I do these days, performance is not important anymore.
 
LVL 18

Author Comment

by:John Gates, CISSP
ID: 16906933
Looks like that is the only thing I can do to improve performance in this program.  The disk reads and line-by-line analysis of the contents take time, but the program is also doing a lot when it runs, and it needs to 8).  Thanks again for all your help.
 
LVL 4

Expert Comment

by:NicoLaan
ID: 16906983
Too bad nothing more can be improved, it seems.
But I'm glad it could be improved as much as it has been, and glad to have helped my share.

Kind regards,

Nico
 
LVL 46

Expert Comment

by:aikimark
ID: 16908637
@dimante

1. What about replacing the line-by-line disk reads with a single read of the entire file contents?

2. What does your code actually do to analyze the lines?  (please include some indication of how you are doing this)

3. Are you using the FileSystemObject to read the files or the more efficient native VB statements?

4. What UI (control) updating are you doing?

5. How big are these files?  (total size and number of lines)

I doubt you are anywhere close to the limit of your performance improvement.
 
LVL 18

Author Comment

by:John Gates, CISSP
ID: 16910680
I changed the reading logic to use the more efficient statements and it did not help the cause.  How am I going to be able to look at each line for errors if I read the whole file in?  UI updates during this process are just a number counter.  There are over 50,000 lines to read when this occurs.  If you think there is more that can be done in this regard, I will create another question.
 
LVL 46

Expert Comment

by:aikimark
ID: 16910969
@dimante

Since we experts can't look over your shoulder, you must be quite specific and detailed with your responses.

<<I changed the reading logic to use the more efficient statements and it did not help the cause.>>
This only partially answers my first question.  Please give us more details and code snippets as to what you are doing and HOW you are doing it.

You haven't answered question 3.


<<How am I going to be able to look at each line for errors if I read the whole file in?>>
We need to know what you are doing before we can answer that in a concise manner.  Otherwise, we'll be generalizing and rambling.


<<UI updates during this process are just a number counter.>>
How often...every line you read?...every file?...every phase of the process?

What kind of controls are you updating and how are you updating?

This is a partial answer to question 4.


<<There are over 50,000 lines to read when this occurs.>>
Each file contains 50k lines or each of the 1000 files contains 50 lines?
This is only a partial answer to question 5.

You still haven't answered question 2.


<<If you think there is more that can be done in this regard I will create another question.>>
Depending on the answers you post to my 5 questions, I might be able to improve your performance in this discussion thread or suggest a new question.  Even if you ask a new question, you will need to be very specific about what code you are using (snippets), what process (algorithm) you are implementing, what performance metrics you are measuring (elapsed, time/file, etc.), how you are measuring, your environment (CPU, RAM, file location, etc.), and what you want to achieve in order to be satisfied.

Please be aware that we might ask you to do some tests on your data that will require you to create new VB projects that you would later throw away.
 
LVL 46

Expert Comment

by:aikimark
ID: 16942918
Two extra thoughts on this problem until we hear more from you...

If your process does not base some of its decisions/actions upon the filename, then the entire directory's data might be appended into a single file by a Shelled (asynchronous) process when the application starts.

If you only have a single string to look for, you might even apply the filter as part of this combining process.

Note: you'll have to refresh your batch-file knowledge.

Note: If these files exist on a file server, you can create a service (or resident application) that preprocesses these files as they arrive/change.  That would greatly reduce the amount of work required by the application and improve user response down to the 1-2 second range.  Such a service can also be run on the user's PC rather than on a server.
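
A sketch of that combining idea, shelling cmd.exe at startup to concatenate the directory into one local work file; the paths and the *.txt mask are assumptions (copy /b appends every match into the destination):

'fire-and-forget: the app continues its own startup while cmd.exe
'concatenates every matching file into a single local file
Shell "cmd /c copy /b ""G:\somedir\*.txt"" ""C:\temp\combined.txt""", vbHide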
