Solved

command in scripts to switch virtual consoles

Posted on 2000-02-15
5
214 Views
Last Modified: 2010-04-22
I am trying to write a script that executes a program processing a file, but I don't know what command I can use that will switch consoles and execute it there as well.

If this question isn't clear, please comment and I will try to get back to you as soon as I can (I check 3 to 5 times a day).
Question by:orz012999
5 Comments
 
LVL 40

Expert Comment

by:jlevie
ID: 2525855
The big question is why you'd want to switch consoles. That's not a very efficient use of machine resources, and there's almost always a better way.
Explanation please?
 

Author Comment

by:orz012999
ID: 2525867
I am trying to process work units for setiathome.  Each directory has a workload in it, and all you need to do to process the file is to run the program that is already in the directory.  

I am new to scripting, but is there a way that I could process all the units at once? Or will it just divide the processing time for one unit six ways, making it just as slow as running the six one at a time?
 
LVL 3

Expert Comment

by:monas
ID: 2526156
orz,

      I believe that seti@home will grab as much processor time as is left available by other tasks. So, theoretically, you can only gain an advantage during the moments you are fetching a new task to do (at that moment another process could use your spare CPU). But even then you lose some CPU to task management, and the net result is unclear.

      Another reason to use several processes is if you have more than one CPU. But again, only if the seti@home client is not clever enough to find and utilize this itself (I don't run it - I don't know).

      And finally - you don't need to switch consoles to run several jobs. Unix has the concept of running jobs in the background. If you start a command as "command &" you will be able to type new commands while your command runs. If you need to capture everything this command writes into a file, use "command > /path/to/file &".
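The background-job idea above can be sketched as follows; the output path and the stand-in command are made up for illustration:

```shell
#!/bin/sh
# Illustrative sketch: a stand-in command run in the background,
# with its output redirected to a file (path chosen arbitrarily).
out=/tmp/bg-demo.out
( echo "job finished" ) > "$out" &   # '&' detaches the job from the prompt
wait                                 # wait for all background jobs to end
cat "$out"
```

You get your prompt back immediately after the `&` line; `wait` is only used here so the demo reads the file after the job is known to be done.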

      Good Luck!
 
LVL 40

Accepted Solution

by:
jlevie earned 170 total points
ID: 2526881
Yes, multiple jobs can be run from one terminal window. The technique is called "running a job in the background" and is done by adding an "&" to the end of the command. In this case, since each task needs to be run in its own work directory, you could do something like:

user> cd work1
user> ./task &
user> cd ../work2
user> ./task &

Or you could use Unix's sub-shell facility. That feature lets you execute an arbitrary command in its own copy of the shell, spawned from the current shell, like:

user> (cd work1; ./task) &
user> (cd work2; ./task) &

The subshell is invoked by wrapping the commands in (). You'll notice two things about those lines: I've executed two commands by separating them with ";", and I've told the subshell, not the main shell, to change to the work directory.

Okay, so far we've got the tasks running in the background, but if they emit anything like error messages or status info, the output is going to be all mixed together on the screen. We solve that by redirecting any output sent to stdout/stderr into a file. Using the subshell example, I'd do:

user> (cd work1; ./task >results 2>&1) &

I can look at what's in "results" ("more work1/results") at any time during the task's execution without affecting the program. Furthermore, if I want to watch (in real time) what's being added to the results file, I can do so by executing "tail -f work1/results".

All of this works equally well from a script file. Unix uses the same commands within a script file as you'd use on the command line. By default, a script file that doesn't say otherwise will use whatever shell it was invoked from, but you can force the system to use a particular shell (sh in this case) by having "#!/bin/sh" as the first line of the file. To be able to execute a script file as if it were a program, one makes the file executable ("chmod +x script"), allowing you to:

user> ./script
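Putting the pieces together, a complete script might look like the sketch below. The directory names, the stand-in "task" program, and the "results" filename are all illustrative; a real setiathome setup would already have its own work directories and client binary:

```shell
#!/bin/sh
# Demo setup (illustrative): create two work directories, each holding
# a tiny stand-in "task" that just records one line of output.
for dir in work1 work2; do
    mkdir -p "$dir"
    printf '#!/bin/sh\necho "processed %s"\n' "$dir" > "$dir/task"
    chmod +x "$dir/task"
done

# The launcher itself: start every task in the background via a
# subshell, redirecting stdout/stderr to a per-directory results file.
for dir in work1 work2; do
    (cd "$dir" && ./task > results 2>&1) &
done
wait    # block until all background jobs have finished
cat work1/results work2/results
```

The `wait` at the end is optional on the command line, but in a script it is the usual way to hold off until every background task is done before looking at the results files.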

How useful this will be in the case of running seti@home is debatable. As monas pointed out, seti@home is designed to be run in the background as a low-priority job using whatever "free cpu time" is available. "Free" in this case basically means when there isn't anything else to do. Since the task is cpu bound, running two copies will probably result in each getting roughly half of the available run time (the kernel will time-slice between the two jobs).

Tasks that have even mixes of I/O and compute are better candidates for multiple simultaneous runs: when one task is waiting for I/O, the other gets a chance at the cpu.

Finally, the last consideration is memory. If the total memory usage of the two tasks and the OS is much greater than the amount of physical memory available, the system will have to start swapping the tasks in and out of memory. This is expensive and can make the run time for a pair of jobs significantly greater than it would have been to run the jobs back-to-back.
 

Author Comment

by:orz012999
ID: 2529160
Thanks for both of your help, I appreciate it:-)
