
Search results for query: *

  • Users: ovince
  • Content: Threads
  • Order by date
  1. ovince

    remove 0s

    I posted this message to the awk forum, but other solutions are accepted too. Hi, does anybody have an idea how to remove leading zeros from this data, i.e. to transform HD000245 HD001358 HD020400 HD000093 HD000007 ........ into HD245 HD1358 HD20400 HD93 HD7 ....... thanks o
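A sed (or awk) substitution anchored to the prefix does it; a minimal sketch assuming every code starts with the literal HD, using a scratch file for the sample data:

```shell
# Drop the zeros that sit right after the "HD" prefix (sample ids.txt below).
printf 'HD000245\nHD001358\nHD020400\nHD000093\nHD000007\n' > /tmp/ids.txt
sed 's/^HD0*/HD/' /tmp/ids.txt
# awk equivalent: awk '{sub(/^HD0+/, "HD")} 1' /tmp/ids.txt
```

Anchoring on `^HD` keeps interior zeros (as in HD20400) intact.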
  2. ovince

    function in .bashrc

    Hi, I would like to put a function into the .bashrc file that would open a new Konsole and cd into the result of the pwd command. Something like: pwdinnk() { p=`echo pwd` echo $p `dcop $KONSOLE_DCOP newSession` `cd $p` } This does not work. In detail: 1. The echo $p part does not give me...
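Two bugs stand out: `p=`echo pwd`` stores the word "pwd" rather than running it, and the backticks around the dcop/cd calls execute their *output*. A corrected sketch follows; the dcop/KONSOLE_DCOP part is KDE 3 specific and the exact session commands vary by version, so that branch is illustrative and guarded so the function still works elsewhere:

```shell
# Corrected sketch of the .bashrc function (dcop branch is hypothetical KDE 3 API).
pwdinnk() {
    p=$(pwd)          # command substitution runs pwd; `echo pwd` only echoed the word "pwd"
    echo "$p"
    if command -v dcop >/dev/null 2>&1; then
        s=$(dcop "$KONSOLE_DCOP" newSession)                             # open a new session
        dcop "$KONSOLE_DCOP" "$s" sendSession "cd '$p'" >/dev/null 2>&1  # cd inside it
    fi
}
```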
  3. ovince

    awk with gnuplot

    Hi All, is it possible to use awk with gnuplot? That is, can I plot data extracted directly from a file by awk, like: awk 'BEGIN {FS=","}{print $53, $58}' | gnuplot ... thanks oliver
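One route is to let awk write the two columns to a scratch file and point gnuplot at it; a sketch with made-up two-column sample data (the poster's $53/$58 work the same way), with the gnuplot call guarded in case it is not installed:

```shell
# awk extracts the columns; gnuplot (if present) plots the result.
printf 'a,1,b,2\nc,3,d,4\n' > /tmp/data.csv              # stand-in for the real CSV
awk 'BEGIN{FS=","} {print $2, $4}' /tmp/data.csv > /tmp/cols.dat
if command -v gnuplot >/dev/null 2>&1; then
    gnuplot -p -e "plot '/tmp/cols.dat' with points"     # -p keeps the window open
fi
```

gnuplot can also read piped data directly through its special filename `'-'`, so a true `awk ... | gnuplot` pipeline is possible as well.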
  4. ovince

    bashrc variables

    Hi All, I cannot find any documentation on the difference between defining a variable in the .bashrc file as, for example: COS_PATH=+$COSSPEC2D_DIR/pro:$COS_PATH versus COS_PATH=$COSSPEC2D_DIR/pro:$COS_PATH (i.e. without the +). Could somebody tell me? Also, I am not sure when to export...
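In Bash itself the leading `+` is not an operator: it is stored literally as the first character of the value (perhaps `COS_PATH+=...`, the append form, was intended). `export` matters when programs started from the shell should inherit the variable. A small demonstration with a made-up COSSPEC2D_DIR:

```shell
# The "+" is literal in a plain Bash assignment.
unset COS_PATH
COSSPEC2D_DIR=/opt/cos

COS_PATH=+$COSSPEC2D_DIR/pro:$COS_PATH   # value begins with a literal "+"
first=$COS_PATH

unset COS_PATH
COS_PATH=$COSSPEC2D_DIR/pro:$COS_PATH    # without "+": a plain path list
export COS_PATH                          # children of this shell now see it
```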
  5. ovince

    header

    Hi All, I join 2 files on matching columns like awk 'BEGIN{FS=",";OFS=","} NR==FNR{a[$1]=$1;b[$1]=$23;c[$1]=$44;next} $1 in a {print a[$1],$1,b[$1],c[$1], $2, $3, $4, $5}' file1.csv file2.csv > joined.csv I have headers in file1.csv and file2.csv. How to keep the corresponding headers in...
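Keeping headers usually means treating line 1 of each file specially: remember file1's header fields while loading the arrays, then print a combined header when line 1 of file2 arrives. A reduced sketch with toy two-column files (the poster's $23/$44 columns generalize directly):

```shell
# Toy stand-ins for file1.csv / file2.csv.
printf 'id,x\nk1,10\nk2,20\n' > /tmp/file1.csv
printf 'id,y\nk1,99\n'        > /tmp/file2.csv
awk 'BEGIN{FS=OFS=","}
     NR==FNR { if (FNR==1) h=$2; else b[$1]=$2; next }  # load file1, saving its header
     FNR==1  { print $1, h, $2; next }                  # emit a combined header once
     $1 in b { print $1, b[$1], $2 }' /tmp/file1.csv /tmp/file2.csv
```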
  6. ovince

    delete column

    hi All, I would like to delete 2 columns from 60-column data. How to do that, other than writing awk '{print $1, $2 .... $60}' file.dat > newFile.dat and leaving out the columns that I don't want in newFile.dat? thanks oliver
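awk can rebuild each record while skipping unwanted field numbers, so the other 58 never need to be listed; a sketch with a short sample row and columns 5 and 7 as the hypothetical ones to drop:

```shell
# Drop columns 5 and 7 without enumerating the rest.
printf '1 2 3 4 5 6 7 8\n' > /tmp/wide.dat
awk '{ out = ""
       for (i = 1; i <= NF; i++)
           if (i != 5 && i != 7) out = out ((out == "") ? "" : " ") $i
       print out }' /tmp/wide.dat > /tmp/narrow.dat
cat /tmp/narrow.dat
```

On GNU systems, `cut --complement` does the same for delimited data.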
  7. ovince

    deleting file-content

    hi, is there an easy way to delete the content of a file while leaving the empty file in the directory? oliver
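Redirecting nothing into the file truncates it while leaving the entry in place; a sketch on a scratch file:

```shell
# Truncate a file to zero bytes while keeping it in the directory.
printf 'some content\n' > /tmp/keepme.txt
: > /tmp/keepme.txt          # the no-op ":" plus redirection empties the file
# GNU coreutils alternative: truncate -s 0 /tmp/keepme.txt
```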
  8. ovince

    string manipulation in BASH

    Hi All, I would like to extract the numbers 5, 15, 35 from filenames like these (the second number in the name): join5x_angle_box5_tri.dat join5x_angle_box15_tri.dat join5x_angle_box35_tri.dat .... How to do this? I was thinking of applying a command like `expr index "$fileName" '.*box.*'` in a loop, but...
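Bash parameter expansion can pull the number out with no external command, assuming the names always contain a `_box<number>_` segment as shown:

```shell
# Strip down to the digits between "_box" and the next "_".
nums=
for f in join5x_angle_box5_tri.dat join5x_angle_box15_tri.dat join5x_angle_box35_tri.dat; do
    tmp=${f#*_box}              # drop everything up to and including "_box"
    n=${tmp%%_*}                # keep what precedes the next "_"
    nums="${nums:+$nums }$n"
done
echo "$nums"
```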
  9. ovince

    getline problem

    hi, could somebody tell me what is wrong with this command: awk 'BEGIN {"read c" | getline vc; close("read c"); print vc}' The idea is to read some variable from STDIN and use it in awk commands. thanks oliver
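`"read c" | getline` starts a child shell whose `read` consumes a line into its own private variable and prints nothing, so getline receives nothing. Piping from `cat` (which copies awk's stdin) works, as does reading the value in the shell and passing it with `-v`; a sketch:

```shell
# Read one line of STDIN inside BEGIN by piping from cat instead of "read".
echo 42 | awk 'BEGIN { "cat" | getline vc; close("cat"); print vc }'
# shell-side alternative: read c; awk -v vc="$c" 'BEGIN { print vc }'
```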
  10. ovince

    find and copy

    hi, I would like to copy the files that the find command finds. Neither of these works: find less/DH* more/DH* -type f -name "all*.dat" -exec cp "c:\users\" find less/DH* more/DH* -type f -name "all*.dat" | cpio -pdumv "c:\users\" How to do this properly? thanks oliver
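find's `-exec` needs the `{}` placeholder as cp's source argument, a destination directory at the end, and the closing `\;`. A sketch using a scratch layout in /tmp standing in for the real less/DH* and more/DH* trees:

```shell
# Copy every matching file into one destination directory.
mkdir -p /tmp/fc/less/DH1 /tmp/fc/dest
printf 'x\n' > /tmp/fc/less/DH1/all_1.dat
find /tmp/fc/less -type f -name "all*.dat" -exec cp {} /tmp/fc/dest/ \;
```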
  11. ovince

    normalization

    hi, I would like to normalize the second column of dataFile to the number in row 1000 of the second column. I do this by first looking at what number I have in the 1000th row of the second column and then awk '{print $1, $2/23.45}' file There must be an easier way to do this, I am sure. Any suggestions? thanks
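Passing the file twice lets awk pick up the divisor on the first pass and divide on the second, so the 23.45 never has to be looked up by hand; a sketch with synthetic 1200-row data:

```shell
# Pass 1 grabs column 2 of row 1000; pass 2 divides by it.
seq 1 1200 | awk '{print $1, $1 * 2}' > /tmp/norm.dat     # synthetic data file
awk 'NR == FNR { if (FNR == 1000) d = $2; next }
     { print $1, $2 / d }' /tmp/norm.dat /tmp/norm.dat > /tmp/norm.out
```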
  12. ovince

    redirect

    Hi, could somebody tell me how to make changes to a file with awk and redirect into a file with the same name? awk '{print $1, $2/2}' f1 > f1 does not work, of course, but there must be a way to do it, no? thanks oliver
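The shell truncates f1 before awk ever reads it, which is why `> f1` loses the data; writing to a temporary file and renaming it over the original is the usual fix. A sketch with a small sample f1:

```shell
# Edit "in place" via a temporary file.
printf '10 4\n6 2\n' > /tmp/f1
awk '{print $1, $2 / 2}' /tmp/f1 > /tmp/f1.tmp && mv /tmp/f1.tmp /tmp/f1
```

moreutils' `sponge` (`awk ... f1 | sponge f1`) packages the same idea.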
  13. ovince

    make 4-column

    hi, I have files with 2 columns of data: g1.dat f1.dat g2.dat f2.dat ... I know how to join g1.dat and f1.dat into one file with 4 columns. I do that like awk 'FNR==NR {a[FNR]=$0;next}{print $1, $2, a[FNR]}' f1.dat g1.dat > f1g1.dat I would like to do this in a loop for each pair, similarly...
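A loop over the f-files can derive each matching g-file name with parameter expansion; a sketch in a scratch directory with two toy pairs, reusing the poster's awk join:

```shell
# Pair up fN.dat with gN.dat by rewriting the name inside the loop.
mkdir -p /tmp/pairs && cd /tmp/pairs
printf 'a 1\n' > f1.dat; printf 'b 2\n' > g1.dat
printf 'c 3\n' > f2.dat; printf 'd 4\n' > g2.dat
for f in f*.dat; do
    g=g${f#f}                 # f1.dat -> g1.dat
    awk 'FNR==NR {a[FNR]=$0; next} {print $1, $2, a[FNR]}' \
        "$f" "$g" > "${f%.dat}${g%.dat}.dat"
done
```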
  14. ovince

    making 2-column data file

    hi, I have 1 file with this content: f1 g1 f2 g2 ... f10 g10 and I would like to make a file from it with this content: f1 g1 f2 g2 .... f10 g10 How to do this in an easy way? thanks oliver P.S. I remember reading somewhere that this is easily done with the cat command. Something like cat -2...
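The half-remembered command was probably paste, not cat: each `-` reads the next line from stdin, so two of them fold a one-value-per-line file into two columns. A sketch, assuming that is the intended transformation:

```shell
# Fold consecutive lines into pairs: one "-" per output column.
printf 'f1\ng1\nf2\ng2\n' > /tmp/onecol.txt
paste -d' ' - - < /tmp/onecol.txt
```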
  15. ovince

    filtering

    hi, could someone help me with this? In each PINK* directory I have a 'pink' file with some content, and I want to remove all lines containing the word 'tau1' or 'tau2' (in all the 'pink' files in all the directories, of course). I wrote this command: find PINK* -type f -name "pink" -exec sed -e '/tau1/d' -e...
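The posted command is missing the `{}` placeholder and an in-place flag, so sed's output goes to the terminal and the files never change. With GNU sed's `-i` (BSD sed wants `-i ''`), a sketch on a scratch layout standing in for the PINK* directories:

```shell
# Delete tau1/tau2 lines in every pink file, in place.
mkdir -p /tmp/PINK1 /tmp/PINK2
printf 'keep\ntau1 x\ntau2 y\nalso keep\n' > /tmp/PINK1/pink
printf 'tau1\n' > /tmp/PINK2/pink
find /tmp/PINK1 /tmp/PINK2 -type f -name pink \
    -exec sed -i -e '/tau1/d' -e '/tau2/d' {} \;
```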
  16. ovince

    multiple task

    Hi All, I really enjoy a command that I learned a few days ago. It finds directories with particular names and removes particular files. Something like find DH* -type f -name "*.dat" -exec rm {} \; I was wondering, is it possible to 'employ' the awk command in a similar manner, and how? For...
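awk slots into `-exec` exactly the way rm does; `{}` becomes the file argument. A sketch that pulls column 1 from every .dat file under a scratch DH directory:

```shell
# Run an awk program on each file find locates.
mkdir -p /tmp/DH1
printf 'a 1\nb 2\n' > /tmp/DH1/t.dat
find /tmp/DH1 -type f -name "*.dat" -exec awk '{print $1}' {} \; > /tmp/col1.out
```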
  17. ovince

    random number

    Hi, is it possible in awk to generate (for example) 1000 random numbers in such a way that they fill a square in the plane? thanks oliver
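awk's `rand()` is uniform on [0,1), so printing a pair per line fills the unit square; scale and shift the pair for any other square. A sketch:

```shell
# 1000 uniform points in the unit square, one (x, y) pair per line.
awk 'BEGIN { srand()
             for (i = 0; i < 1000; i++) print rand(), rand() }' > /tmp/square.dat
```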
  18. ovince

    awk pipeline

    hi, I would like to use file names within awk, so I tried something like this: for file in box*.dat; do awk 'BEGIN {"echo $file" | getline fl; close("echo $file"); print fl}' $file; done Any idea what I am doing wrong? thanks
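The single quotes keep the shell from expanding `$file`, so awk pipes the literal string to echo. Either hand the name in with `-v`, or use awk's own FILENAME variable; a sketch in a scratch directory:

```shell
# Two working ways to see the current file name inside awk.
mkdir -p /tmp/ap
printf 'x\n' > /tmp/ap/box1.dat
for file in /tmp/ap/box*.dat; do
    awk -v fl="$file" 'BEGIN { print fl }' "$file"   # value passed in by the shell
    awk 'FNR == 1 { print FILENAME }' "$file"        # awk already knows the name
done
```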
  19. ovince

    redirect

    Hi, I would like to do the same thing to many files and put the results into files that have the same names as the input files but with different suffixes. For example, if I have data files like boxHUN.dat boxDNK.dat boxCHY.dat, the results should be written into boxHUN.txt boxDNK.txt boxCHY.txt...
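Deriving the output name from the input name with `${f%.dat}` does it; a sketch in a scratch directory, with a made-up per-file awk command standing in for the real processing:

```shell
# Same command on every file, output under the same name with a .txt suffix.
mkdir -p /tmp/sfx && cd /tmp/sfx
printf '1 2\n' > boxHUN.dat
printf '3 4\n' > boxDNK.dat
for f in box*.dat; do
    awk '{print $1 + $2}' "$f" > "${f%.dat}.txt"   # any per-file command here
done
```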
  20. ovince

    separate

    hi All, what is the easiest way (if possible with one command line) to split a large file into subfiles, from one header to the next? For example, #head 1 23434 23434 34324 #head 2 345345 #head 3 34543 34545 34545 34545 should be separated into #head 23434 23434 34324 #head 345345 and...
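awk can switch output files whenever a header line comes by; a sketch assuming every section starts with a `#head` line (GNU `csplit big.dat '/^#head/' '{*}'` is a close alternative):

```shell
# Start a new part file at each "#head" line (input must begin with one).
printf '#head 1\n23434\n#head 2\n345345\n' > /tmp/big.dat
awk '/^#head/ { if (out) close(out); out = sprintf("/tmp/part%d.dat", ++n) }
     { print > out }' /tmp/big.dat
```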
