Solved

Getting Next Record From Text File

Posted on 2015-02-16
35 Views
Last Modified: 2015-03-11
I time fitness events, and the software that does the actual RFID data processing writes a text file as participants finish, with entries like this:

2,3586,0,"09:59:14.720",1,2
1,3400,0,"09:59:15.782",2,1
2,3586,0,"09:59:15.783",2,2
4,3399,0,"10:00:50.260",1,4
3,3294,0,"10:00:50.750",1,3

What I want to do is repeatedly append to my results database everything that has been added since where I last left off.  I am already parsing this data into an array so I can access the data points I need.  I believe I simply need to get the last entry, based on the time of submission (say "09:59:15.783"), from my database and then take everything after that point from the text file.  Do I have to scroll through the entire file to find that entry and then pick up with the next one, or is there another way?  The file could have in excess of 10,000 records in it.
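For illustration, here is a minimal sketch of parsing one of these lines and pulling out the timestamp (Python is used only because the thread does not specify a language; the meaning of the fields other than the quoted time is an assumption):

# Sketch only: split one RFID record into fields and extract the timestamp.
# Field meanings other than the quoted time are assumptions about the format.
import csv
from datetime import datetime

def parse_line(line):
    fields = next(csv.reader([line]))          # csv handles the quoted time field
    read_time = datetime.strptime(fields[3], "%H:%M:%S.%f").time()
    return fields, read_time

fields, read_time = parse_line('2,3586,0,"09:59:14.720",1,2')
# fields    -> ['2', '3586', '0', '09:59:14.720', '1', '2']
# read_time -> 09:59:14.720000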

Thank you.
Question by:Bob Schneider
11 Comments
 
LVL 38

Expert Comment

by:Gerwin Jansen, EE MVE
ID: 40612128
10,000 lines of 30 bytes would mean a 300 kB file - I'd just read the whole file into an array. Then skip the number of lines you already have in your database (minus a few) and start searching for the last entry; any new entries you find after that, you put into your database. The above assumes that new entries only get added at the bottom of the file.
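A minimal sketch of that idea (Python purely for illustration; db_line_count, last_saved_line and save_record are hypothetical stand-ins for your own database code):

# Sketch of the approach above, not a finished implementation.
def import_new_records(path, db_line_count, last_saved_line, save_record):
    # Read the whole file into memory (roughly 300 kB for 10,000 lines).
    with open(path) as f:
        lines = [ln.rstrip("\n") for ln in f if ln.strip()]

    start = max(db_line_count - 5, 0)        # back up a few lines to be safe
    found_last = last_saved_line is None     # empty database: take everything
    for line in lines[start:]:
        if not found_last:
            found_last = (line == last_saved_line)   # last record already imported
            continue
        save_record(line)                    # everything after it is new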
 

Author Comment

by:Bob Schneider
ID: 40612188
So determine the number of lines in the database, minus 10 for instance, start there, and search for the first new line?  That makes sense, except that I don't write every line in the text file to the database.

Consider a 5K.  As a runner approaches, the RFID system starts to pick up the signal from their bib/chip.  They may get read 3 or 4 times, although usually 1 or 2.  My system takes their strongest read.  With this new information, how would you modify your suggestion?  Should I create a counter that represents the number of lines that have been read and just start where that number left off?  Or would it be as easy, and nearly as quick, to just read through the file until we get to the last record imported into the db?
 

Author Comment

by:Bob Schneider
ID: 40612206
Let me amend my comment.  Which would be better from a system/efficiency/speed standpoint:

1) Read through the text file, beginning at the line corresponding to the number of records I have in my db (knowing that there are more lines than that in the text file), until I find the last value in the db, and start there, or

2) Add a counter (iNumRcds) representing the actual number of records in the text file and just begin with the next one, storing that counter as a public variable.  Note that once I closed the program, if I had to re-open it, the counter would start at 0 (unless I stored it as a db value).

I would think 1) would be best, no?
 
LVL 38

Expert Comment

by:Gerwin Jansen, EE MVE
ID: 40612370
About the 'strongest' reading - I'd just read every record and do the filtering/sorting later, in the database or the application. Never throw away raw data, I would suggest. I'm assuming that your reads have their own ID so you can detect an individual runner.

I would not work with any global counter; just count the number of records you have for that race in your database and, with that number in mind, read the text file. Use the number of records as an 'index' into your text file for where to start reading the new records. I'd set the starting point a few records back so you don't miss an 'off by one' record. Match the last record in your database and start reading from the next one onward.
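To go with the 'filter later' advice above, here is a rough sketch of picking the strongest read per chip after every raw read has been stored (Python for illustration only; which column holds the signal strength is an assumption, since the file format shown in the question does not say):

# Sketch only: keep one read per chip - the one with the highest assumed strength.
# The strength column index (4 here) is an assumption about the file format.
def strongest_reads(records):
    best = {}
    for row in records:                  # row: parsed fields of one raw read
        chip = row[1]                    # assumed chip/bib number column
        strength = int(row[4])           # assumed signal-strength column
        if chip not in best or strength > int(best[chip][4]):
            best[chip] = row
    return list(best.values())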

BTW: do you have no commercial system that does the above for you? I've seen a few systems, ranging from mylaps, bibtag, ipico etc. There is always a guy with a laptop watching the results pour in as runners pass ;)
 

Author Comment

by:Bob Schneider
ID: 40612557
Thanks for the feedback.  Makes sense.  Regarding the third-party (commercial) system: doing this all internally gives us the most control over our database, local event timing, web-based results, sending email results, inserting finish line pix, etc.  Our chip provider has software we can buy but, again, third-party software is as limiting as it is effective.
 
LVL 38

Accepted Solution

by:
Gerwin Jansen, EE MVE earned 500 total points
ID: 40612633
>> doing this all internally gives us the most control over our database ...
No argument there ;) Some recommendations: start with developing a basic system first, get all unwanted 'features' out and then start adding functionality gradually.
 

Author Comment

by:Bob Schneider
ID: 40612687
Thanks.  BTW, your point about never throwing away raw data is a good one.  That is why we never, ever modify the text file generated by the RFID system.
 
LVL 46

Expert Comment

by:Martin Liss
ID: 40659123
I've requested that this question be closed as follows:

Accepted answer: 500 points for Gerwin Jansen's comment #a40612370

for the following reason:

This question has been classified as abandoned and is closed as part of the Cleanup Program. See the recommendation for more details.
 

Author Closing Comment

by:Bob Schneider
ID: 40659124
Thank you very much for your help.
 
LVL 38

Expert Comment

by:Gerwin Jansen, EE MVE
ID: 40659238
You're welcome :)
