• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 524

How to modify the header in spool with Linux/Oracle?

Hello, when spooling from the attached script, I need to change the headers of the output, which currently looks like this:

2013/09/24 17:30:00! 10027! 0! PSM_4SA4______I! 223.040253! # ..! 0! # ..! 0! # ..! 0! # ..!
2013/09/24 17:30:00! 10024! 0! PSM_4SA1______I! 167.887207! # ..! 0! # ..! 0! # ..! 0! # ..!



so that it instead shows the column names, separated by "!":

UTCTime! pointnumber! pointname! valor_inst!tlq_inst!valor_prom!tlq_prom!valor_max!tlq_max!utctime_max!valor_min!tlq_min!utctime_min
2013/09/24 17:30:00! 10027! 0! PSM_4SA4______I! 223.040253! # ..! 0! # ..! 0! # ..! 0! # ..!
2013/09/24 17:30:00! 10024! 0! PSM_4SA1______I! 167.887207! # ..! 0! # ..! 0! # ..! 0! # ..!



What option would I use?
thanks
Test.sh
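
(Test.sh itself is not reproduced here; judging by the answers below, it is a shell wrapper that runs the query through SQL*Plus and spools the output. A rough sketch of that kind of wrapper, with placeholder connect string, paths and variable values rather than the real ones:)

#!/bin/sh
# Hypothetical sketch of a spool wrapper -- not the actual Test.sh
OUTPUT=/tmp/reports        # spool destination directory (placeholder)
FECHA=`date +%Y%m%d`       # date stamp used in the spool file name
T_ATRAS=0                  # how many days back to report (placeholder)

sqlplus -s usr/pwd@mydb <<EOF
set heading off feedback off pagesize 0 linesize 2000 trimspool on
spool $OUTPUT/ANALOG_01_$FECHA.txt
-- the SELECT against ANALOG_01 goes here
spool off
exit
EOF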
carlino70 Asked:
1 Solution
 
sventhan Commented:
Try this.

Modify your script like this: add another SELECT that returns all of your headers, UNION ALL'ed on top of the data query, as shown below.

--host echo Copying out ANALOG_01
spool $OUTPUT/ANALOG_01_$FECHA.txt
select UTCTime! pointnumber! pointname! valor_inst!tlq_inst!valor_prom!tlq_prom!valor_max!tlq_max!utctime_max!valor_min!tlq_min!utctime_min from dual
union all
SELECT
TO_CHAR(H.UTCTIME,'YYYY/MM/DD HH24:MI:SS')||'!'||TO_CHAR(H.POINTNUMBER)||'!'||S.POINTNAME||'!'||TO_CHAR(H.VALUE_INST)||'!'||ge_pkt_conv_funcs.f_convert_tlq(H.TLQ_INST)||'!'||TO_CHAR(H.VALUE_PROM)||'!'||ge_pkt_conv_funcs.f_convert_tlq(H.TLQ_PROM)||'!'||TO_CHAR(xa_time_cnv.utc_to_loc(H.UTCTIME_MAX),'YYYY/MM/DD HH24:MI:SS')||'!'||TO_CHAR(H.VALUE_MAX)||'!'||ge_pkt_conv_funcs.f_convert_tlq(H.TLQ_MAX)||'!'||TO_CHAR(xa_time_cnv.utc_to_loc(H.UTCTIME_MIN),'YYYY/MM/DD HH24:MI:SS')||'!'||TO_CHAR(H.VALUE_MIN)||'!'||ge_pkt_conv_funcs.f_convert_tlq(H.TLQ_MIN)||'!'
FROM ANALOG_01 H, xaJsdb.AnalogPoint S
WHERE H.POINTNUMBER = S.POINTNUMBER AND UTCTIME >= trunc(sysdate)-$T_ATRAS-1 AND UTCTIME < trunc(sysdate)-$T_ATRAS
ORDER BY UTCTIME;
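
Depending on how the rest of the script is set up (an assumption here, since Test.sh itself is not shown), you may also want the usual SQL*Plus formatting settings before the SPOOL so that only the rows themselves end up in the file:

set heading off
set pagesize 0
set feedback off
set trimspool on

HEADING OFF suppresses SQL*Plus's own column headings, PAGESIZE 0 removes page breaks and titles, FEEDBACK OFF drops the "n rows selected." trailer, and TRIMSPOOL ON trims trailing blanks from each spooled line.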
 
sventhan Commented:
If you get any errors, you can add quotes for the header SELECT, like below:

select "UTCTime!", "pointnumber!" etc. from dual
 
carlino70 (Author) Commented:
Thanks sventhan, look at this:

select 
'UTCTIME!POINTNUMBER!POINTNAME!VALOR_INST!TLQ_INST!VALOR_PROM!TLQ_PROM!UTCTIME_MAX!VALOR_MAX!TLQ_MAX!UTCTIME_MIN!VALOR_MIN!TLQ_MIN!'
from dual
union all
SELECT
TO_CHAR(H.UTCTIME,'YYYY/MM/DD HH24:MI:SS')||'!'||TO_CHAR(H.POINTNUMBER)||'!'||S.POINTNAME||'!'||TO_CHAR(H.VALOR_INST)||'!'||ge_pkt_conv_funcs.f_convert_tlq(H.TLQ_INST)||'!'||TO_CHAR(H.VALOR_PROM)||'!'||ge_pkt_conv_funcs.f_convert_tlq(H.TLQ_PROM)||'!'||TO_CHAR(xa_time_cnv.utc_to_loc(H.UTCTIME_MAX),'YYYY/MM/DD HH24:MI:SS')||'!'||TO_CHAR(H.VALOR_MAX)||'!'||ge_pkt_conv_funcs.f_convert_tlq(H.TLQ_MAX)||'!'||TO_CHAR(xa_time_cnv.utc_to_loc(H.UTCTIME_MIN),'YYYY/MM/DD HH24:MI:SS')||'!'||TO_CHAR(H.VALOR_MIN)||'!'||ge_pkt_conv_funcs.f_convert_tlq(H.TLQ_MIN)||'!'
FROM ANALOG_01 H, xaJsdb.AnalogPoint S
WHERE H.POINTNUMBER = S.POINTNUMBER AND H.UTCTIME >= trunc(sysdate)-1 AND H.UTCTIME < trunc(sysdate)
ORDER BY 1;


It works with "ORDER BY 1" instead of "ORDER BY UTCTIME": in a UNION ALL query the combined result has only a single, unnamed concatenated column, so the ORDER BY has to reference it by position. Also, all the headers must be contiguous inside a single quoted string.
Regards.
 
slightwv (Netminder) Commented:
Even though this has already been closed out, I wanted to offer an alternate solution: the PROMPT command, outside of the SELECT.

In the original script you posted, put it after the SPOOL and before the SELECT:
prompt UTCTime! pointnumber! pointname! valor_inst!tlq_inst!valor_prom!tlq_prom!valor_max!tlq_max!utctime_max!valor_min!tlq_min!utctime_min
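
In context, that part of the script would then look roughly like this (the spool file name and SET options are assumptions based on the rest of this thread, not your actual Test.sh):

set heading off feedback off pagesize 0 trimspool on
spool $OUTPUT/ANALOG_01_$FECHA.txt
prompt UTCTime! pointnumber! pointname! valor_inst!tlq_inst!valor_prom!tlq_prom!valor_max!tlq_max!utctime_max!valor_min!tlq_min!utctime_min
-- your original SELECT ... FROM ANALOG_01 H, xaJsdb.AnalogPoint S ... goes here, unchanged
spool off

PROMPT just echoes its text, and while a SPOOL is active that text is written to the spool file, so the header ends up as the first line without touching the SELECT itself.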

 
carlino70 (Author) Commented:
Thanks slightwv, it is a good alternative idea.

Regards
Question has a verified solution.
