Unix OS

32K

Solutions

18K

Contributors

Unix is a multitasking, multi-user computer operating system originally developed in 1969 at Bell Labs. Today it is a modern OS with many commercial flavors and licensees, including FreeBSD, Hewlett-Packard's HP-UX, IBM AIX, and Apple macOS. Apart from its command-line interface, most Unix variants support the standardized X Window System for GUIs, with the exception of macOS, which uses a proprietary windowing system.


Hi, all.

I have an issue with iptables rules. The image below shows my iptables rules, but I cannot telnet to port 2196. I've edited /etc/sysconfig/selinux directly and restarted iptables; SELinux is disabled.

What am I doing wrong?
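Two things worth checking first (suggestions based only on the symptoms described, not a confirmed diagnosis): whether anything is actually listening on port 2196, and whether an earlier rule in the chain is shadowing the ACCEPT rule.

```shell
# Check whether any process is actually listening on TCP 2196 (ss is part of iproute2)
ss -ltn
# Look for ':2196' under "Local Address:Port". If no process is listening there,
# telnet fails regardless of what the iptables rules allow.
# Also check rule order (needs root); an earlier REJECT/DROP in INPUT wins:
#   iptables -L INPUT -n -v --line-numbers
```

A "connection refused" from telnet usually points at no listener; a hang/timeout usually points at a firewall drop.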


At work I have a separate Unix password for Unix systems. I reset my password through a utility that gives me a temporary password. Then I run

sudo passwd

and change the password. I log out and log back in with the new password a few seconds later. Success! But a few hours later the new password no longer works, and I have to go through the password reset process again.

Anyone have a hunch what's going on?
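One thing that may be worth checking (an assumption based on the commands shown, not a confirmed diagnosis): `passwd` run under `sudo` with no username changes root's password, not the invoking user's, so the account's own temporary password may still be in effect and expiring. To change a specific password:

```
passwd              # changes the current user's own password
sudo passwd alice   # as an admin, changes user alice's password ('alice' is a hypothetical name)
```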
I added this alias in my .bashrc file so it looks like this:

alias gh="grep \"hello world\" /folder/"

but when I type 'gh' into the command line I get the error:

> gh
bash: gh: command not found

Am I missing something?
I created a script that appends a cron job if it's not already in the crontab file, but the appended job will not run unless I edit the crontab file using crontab -e. Is there a way to refresh cron on AIX without killing the process?
Dear Gurus,
I am using a curl command to transfer files, as shown below. The reason is that the file needs to be sent to a particular port to enable tokenizing.
curl -k -v -T /export/home/flex/sample_files/Test_account_20180610.csv https://starserver.com:4443/webdav/Test_account_20180610.csv

This works fine.
The problem is that now I need to automate it using a scheduler, but the file name suffix, which is a date stamp, changes every day.
Is there a way to write a curl command that picks up the file name irrespective of the timestamp and puts it in the webdav folder?
Something like this:

curl -k -v -T /export/home/flex/sample_files/Test_account_*.csv https://starserver.com:4443/webdav/Test_account_*.csv

So that in the webdav folder I get the tokenised file with the same name as in the original location.

Please advise.

Thank you very much.
Hello experts,

I'm very new to linux.

I'm trying to run the following commands on my Ubuntu machine:

COPY ./package.json /data/web-app
WORKDIR /data/web-app

Can someone please show me how to run these commands?

Cheers

Carlton
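COPY and WORKDIR are Dockerfile instructions, not shell commands; they only take effect when Docker builds an image from a Dockerfile (docker build .). If the goal is simply to do the same thing directly on the Ubuntu machine, the plain-shell equivalents are roughly:

```shell
mkdir -p /data/web-app            # make sure the target directory exists
cp ./package.json /data/web-app   # shell counterpart of COPY
cd /data/web-app                  # shell counterpart of WORKDIR
```

To use them as written, put the two lines in a file named Dockerfile and build an image from it instead.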
Unfortunately it is not clear to me how I can change, for a client, the storage pool where the data is stored.

What are the commands to be applied to tsm?

upd copygroup gives me this error:

tsm: TSM_7.1>upd copygroup SAP_PD ACTIVE PGT_DAI STANDARD Type=Archive destination=VCC_PRO_NEW
Session established with server TSM_7.1: AIX
  Server Version 7, Release 1, Level 1.100
  Server date/time: 06/05/18   16:13:12  Last access: 06/05/18   15:28:47

ANR1585E UPDATE COPYGROUP: Policy set ACTIVE cannot be modified.
ANS8001I Return code 3.

my policy

tsm: TSM_7.1>query policyset

Policy        Policy        Default       Description
Domain        Set Name      Mgmt
Name                        Class
                            Name
---------     ---------     ---------     ------------------------
SAP_PD        ACTIVE        SAP_MC        Policy set per ambiente
                                           SAP
SAP_PD        SAP_PS        SAP_MC        Policy set per ambiente
                                           SAP
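The error itself says the ACTIVE policy set cannot be modified directly. A commonly used TSM sequence (a sketch based on the query output above; verify it against your environment before running) is to update the copy group in the named policy set SAP_PS, validate it, and then activate it so the change is copied into ACTIVE:

```
upd copygroup SAP_PD SAP_PS PGT_DAI STANDARD Type=Archive destination=VCC_PRO_NEW
validate policyset SAP_PD SAP_PS
activate policyset SAP_PD SAP_PS
```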
I added my domain group to the /etc/sudoers file, but it is not working. I ran

sudo visudo

and in the file, after the line

%sudo   ALL=(ALL:ALL) ALL

I added:

%mydomain\\unixadmins ALL=(ALL:ALL) ALL

I saved the file and checked /etc/sudoers, and it looks OK. I then log in as a domain user to Ubuntu and try to create a directory under /mnt.

error:
username@domain.local is not in the sudoers file.  This incident will be reported.
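One thing worth verifying first (a suggestion, not a confirmed fix): whether the login session actually carries the group, and what name the system reports for it, since the %group line in sudoers must match that name exactly:

```
id username@domain.local
# The group must appear in the 'groups=' list. Note exactly what it is called
# there: depending on the SSSD/winbind configuration it may show up as
# 'unixadmins', 'mydomain\unixadmins', or 'unixadmins@mydomain', and the
# sudoers entry has to match that form.
```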
I am trying to mount a Synology NAS drive on Ubuntu, but it is in a different IP range. The NAS has NFS support and permissions set. I run

sudo mount 12.11.1.5/volume1/backup /mnt/backup/

My Ubuntu machine has 12.10.1.8, and I get a connection timeout. And with an SMB connection,

sudo mount -t cifs //12.11.1.5/volume1/backup /mnt/backup user=Administrator

I get mount error(115): Operation now in progress.
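Two syntax details may matter here (suggestions based on the commands shown): an NFS source needs a colon between host and path, and CIFS options must go after -o. Sketches:

```
# NFS: note the ':' between the address and the export path
sudo mount -t nfs 12.11.1.5:/volume1/backup /mnt/backup

# CIFS: credentials go in an -o option string
# (on a Synology the SMB share name may be just 'backup' rather than
# 'volume1/backup'; check the NAS share settings)
sudo mount -t cifs //12.11.1.5/volume1/backup /mnt/backup -o username=Administrator
```

That said, a timeout across two different IP ranges also suggests a routing or firewall problem between the subnets, which no mount syntax will fix; verify the NAS is reachable (e.g. ping, or showmount -e 12.11.1.5) before retrying the mount.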
On Tivoli TSM I need to create a new storage pool to which all my nodes will point.
What operations should I perform?

Hi everyone!
I am trying to do a few things in my bash script (script1.sh):  pass logicals/variables to my SAS program (program1.sas), execute SAS code (program1.sas), check for errors, write all messages to a log file (sas_program.log), and also execute another script (script2.sh) which has a few directory logicals/variables shortcuts.

Please note: variable/logicals (file1, file2, plog, and stat) are also defined in the SAS program.  

Please provide any suggestions or examples if you can.
Here is what I have so far:

#!/bin/bash
# script1.sh

# Execute script2.sh in the current shell so its variable definitions persist here
source script2.sh

# Create variables for directories
export DATADIR1=/root/alldirs/2018/data1
export DATADIR2=/root/alldirs/2018/data2
export DATADIR3=/root/alldirs/2018/data3
export PROGDIR1=/root/alldirs/2018/prog1

# Create variables for the data files used in program1.sas and the final log file
export file1=${DATADIR1}/sasdata1.sas7bdat
export file2=${DATADIR2}/sasdata2.sas7bdat
export plog=${DATADIR3}/sas_program.log

# Set stat variable for future use
stat=0

# Execute the SAS program, check for errors, and write all messages to the log file
sas ${PROGDIR1}/program1.sas -log ${DATADIR3}/program1.log
stat=$?
if [[ "$stat" != 0 ]]; then
    echo -e "--- Error: Abnormal end in program program1.sas. Check log file! ---" >> $plog
    exit 2
else
    echo -e "--- Program1.sas ran successfully! ---" >> $plog
fi
exit 0

Any suggestions would be appreciated!
Thank you!
hi,

Have any of you experienced any kind of DB2 file corruption on Unix/Linux?
What percent use Windows versus Mac versus other?

I am curious to see the difference...

Thanks
How can environment variables set in one Unix/bash script be tested from another Unix/bash script?

I have the script below that creates environment variables; those variables will also be used in future scripts, for efficiency. The purpose is to use only the variables, not full directory paths, in future scripts. What would another script look like that tests each of these variables? Please provide an example if you can.

#!/bin/bash
export DATADIR1=/root/alldirs/2018/data1
export DATADIR2=/root/alldirs/2018/data2
export DATADIR3=/root/alldirs/2018/data3
echo "This script just stopped running"

Any suggestions would be appreciated!
Thank you!
I am attempting to suppress Last Login information being displayed for a specific user when sudo'ing a command.

I have set the attribute "DISPLAY_LAST_LOGIN" to 0 for that user in the userdb; however, it is still showing the Last Login information.

Our OS is HP-UX, release B.11.31.

root myserver as livehost:/usr/users/myuser/scripts/test $ userdbget -u myuser
myuser DISPLAY_LAST_LOGIN=0

What am I missing?
Hi,
I have a big Unix file, aa.0016.s0, which I am unable to open in Notepad++. Can I make that file smaller and then open it in Notepad?

Thanks
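One way to cut the file into editor-sized pieces is the standard split utility (the sizes below are arbitrary examples):

```shell
# Split the big file into ~50 MB pieces: aa.part_aa, aa.part_ab, ...
split -b 50m aa.0016.s0 aa.part_
# Or split by line count instead (1,000,000 lines per piece):
split -l 1000000 aa.0016.s0 aa.lines_
ls aa.part_* aa.lines_*
```

The pieces concatenate back into the original with cat aa.part_* > aa.0016.s0.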
Individual log files on my server get zipped every hour, and the individual files are then deleted.

The problems I am facing with the zipped log files are:
1. I am not able to grep them as easily as individual files.
2. I am not able to tail them to see any recent issues.

If I copy one over to my local Windows laptop using WinSCP, unzip it, and try to open the individual file in Notepad++, it says the file is too huge to open.

How can I extract the zip file on the Unix box itself and check the log files with grep, tail, etc.?

Please advise.
Is there a way to display the disk serial number using lspv in AIX 7.1?
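lspv itself lists physical volume IDs rather than hardware serials; on AIX the serial number is typically in the VPD shown by lscfg (a hedged sketch; hdisk0 is an example name, and the exact VPD fields vary by device):

```
lscfg -vpl hdisk0 | grep -i 'serial'
# Or combine with lspv to cover every disk it reports:
for d in $(lspv | awk '{print $1}'); do
    echo "$d: $(lscfg -vl $d | grep -i 'serial')"
done
```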
Hi Team,

I have the file below:

ID      Name      Hours Worked      Hourly Pay
1425    Juan      18                14.25
4321    George    22                21.11
6781    Anne      44                16.77
1451    Ben       36                21.77
2277    Tuan      16                18.77
I need help with the grep commands below; I am new to grep.

First, I need to display the hourly pay of Anne (only the last field) using grep.

Second, I need a one-line command to find the ID and hours worked for employees who earn more than $20 per hour.

We have a request from the applications team to grant their non-privileged Solaris and AIX IDs the ability to execute their shell scripts (which contain lines that run binaries):
  sudo /gl/_ctron_/start1292
  sudo /gl/_ctron_/start1291

Q1:
Is there any way not to grant them sudo & root and yet still allow them to stop/start the services?
Or, if we grant sudo, can we restrict them to run only those specific scripts, so that their sudo can't do anything else?

Q2:
Is there any way we can use the SGID or SUID bits to grant this without giving them root/sudo privileges?
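For Q1, sudoers itself can restrict users to exact commands; a sketch (the group name %appteam is a hypothetical placeholder, the script paths are from the question):

```
%appteam ALL=(root) NOPASSWD: /gl/_ctron_/start1292, /gl/_ctron_/start1291
```

With only those paths listed, sudo refuses everything else for that group; just make sure the scripts are root-owned and not group-writable, or they become an arbitrary-command escalation path. For Q2, note that the setuid bit is ignored on shell scripts on many Unix systems for security reasons, so an SUID approach would generally require a compiled wrapper binary; sudoers is the more common and auditable route.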
Is there any Unix or shell script that can fetch all the DB objects from a particular schema, with the proper extensions (.pks, .pkb, etc.), to my local system?
I am currently writing a management menu interface for managing printers on our HP-UX system.

Scripts are written in Korn Shell.

The following command
lpstat -pTERRYTST

returns the following:

printer TERRYTST disabled since Apr 10 08:36 -
        new printer
        fence priority : 0

I need to be able to search the response for keywords to identify states.

So I need to fill a variable with 1 if the printer is disabled, and another variable with 1 if it's new.

No idea how to complete this in Korn Shell scripting as am very new to the language.

Help! :)
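One straightforward pattern (a sketch, and equally valid in ksh): capture the lpstat output once, then set each flag with a grep -q test.

```shell
# Stub of the lpstat output so the sketch is self-contained; in the real
# script use:  out=$(lpstat -pTERRYTST)
out="printer TERRYTST disabled since Apr 10 08:36 -
        new printer
        fence priority : 0"

disabled=0
new=0
echo "$out" | grep -q "disabled"    && disabled=1
echo "$out" | grep -q "new printer" && new=1
echo "disabled=$disabled new=$new"
```

grep -q is silent and only sets the exit status, so the && runs the assignment exactly when the keyword appears in the response.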
I have an AIX 6.1 TL9 system. Is it necessary to install the fix for sendmail CVE-2014-3956?

Does the problem exist only if the system is exposed to the internet?
I need to extract a bunch of zip files, but I have requirements.

The zip files are scattered in various folder like this

  • base_folder/
  • base_folder/batch_1/batch_1_1.zip
  • base_folder/batch_1/batch_1_2.zip
  • base_folder/batch_2/batch_2_1.zip
  • base_folder/batch_2/batch_2_2.zip
  • base_folder/batch_2/batch_2_1.zip
  • base_folder/big_batch/batch_a/batch_a_1.zip
  • base_folder/big_batch/batch_a/batch_a_2.zip
  • base_folder/big_batch/batch_b/batch_b.zip

I want to extract the files to another folder and keep the same folder structure

  • base_folder/extracted/
  • base_folder/extracted/batch_1/file_1
  • base_folder/extracted/batch_1/file_2
  • base_folder/extracted/batch_2/file_2_1
  • base_folder/extracted/batch_2/file_2_2
  • base_folder/extracted/batch_2/file_2_1
  • base_folder/extracted/big_batch/batch_a/file_a_1
  • base_folder/extracted/big_batch/batch_a/file_a_2
  • base_folder/extracted/big_batch/batch_b/file_b

I want to extract everything in the zip file except for files that have certain extensions

  • file - OK
  • file.exe - OK
  • file.xml - OK
  • file.txt - Not OK
  • file.xls - Not OK

Is this possible with a few Unix commands? If it isn't entirely possible, what is the closest I can do?
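It is possible with find plus unzip; a sketch under the assumptions above (run from the parent of base_folder; unzip's -x list excludes entries by pattern):

```shell
# Extract every zip under base_folder into base_folder/extracted, preserving
# the relative folder structure and skipping .txt and .xls entries.
cd base_folder
find . -path ./extracted -prune -o -name '*.zip' -print | while read -r z; do
    dest="extracted/$(dirname "$z")"       # mirror the zip's folder under extracted/
    mkdir -p "$dest"
    unzip -o "$z" -x '*.txt' '*.xls' -d "$dest"
done
```

The -prune keeps find from descending into extracted/ on a re-run; quoting the -x patterns stops the shell from expanding them before unzip sees them.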
tail -n5000 xyz.log

The above shows the last 5000 lines, right?

If I want to see all 15723 lines of xyz.log, what command do I have to give?

tail -n5000 xyz.log | grep 'ERROR WS'

How do I make the above a case-insensitive search, so that something like

tail -n5000 xyz.log | grep 'error ws'

finds the same lines? And how do I make it a whole-word search, so that I won't see results like "ERROR aaa WS"?

Please advise.
