Stefanie
Reading from large text files
Oct 31 2006 9:25 PM
Hello, I have a bit of a design problem and would appreciate any suggestions.

I've written an application that goes through the lines in a log file, parses the text, and inserts the appropriate data into a database. A new log file is generated by an auth server every 30 minutes and is written to frequently; over that half-hour period, a single log file can grow to about 15,000 lines. I have no control over the size of these files.

I need to pick up updates to the log as quickly as possible, so I use a FileSystemWatcher to fire an event whenever the file changes. Each line in the log contains a timestamp (in chronological order), which I use to keep track of the last line I read. Of course, I still have to scan through all of the lines until I reach the ones I haven't read yet. I can't delete old lines from the file, because once the auth server is done writing to it I need to move it into another folder.

If it isn't obvious already, my problem is that towards the end of the half-hour the file is huge, and re-reading it from the top on every update hogs the CPU. Given the constraints, I'm not sure how else to handle this. Help!
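One common way to avoid rescanning the whole file is to remember the byte offset where the last read stopped and Seek back to it on the next change event, so each event reads only the newly appended bytes. Below is a minimal sketch of that idea; the class and method names (LogTailer, ProcessLine) are placeholders I've made up, not part of the original application, and a real version would also need to handle a partial last line that the server hasn't finished writing.

```csharp
using System;
using System.IO;
using System.Text;

class LogTailer
{
    // Byte offset just past the last data we processed.
    private long lastPosition = 0;

    // Call this from the FileSystemWatcher.Changed handler.
    // It reads only the bytes appended since the previous call,
    // so cost is proportional to the new data, not the file size.
    public void ReadNewLines(string path)
    {
        // FileShare.ReadWrite lets us read while the auth server
        // still has the file open for writing.
        using (var stream = new FileStream(path, FileMode.Open,
                   FileAccess.Read, FileShare.ReadWrite))
        {
            // Skip everything we've already seen.
            stream.Seek(lastPosition, SeekOrigin.Begin);

            using (var reader = new StreamReader(stream, Encoding.UTF8))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    ProcessLine(line); // parse and insert into the database
                }
            }

            // Remember where we stopped for the next event.
            lastPosition = stream.Position;
        }
    }

    private void ProcessLine(string line)
    {
        // Placeholder: parse the timestamp and fields, write to the DB.
        Console.WriteLine(line);
    }
}
```

Since FileSystemWatcher can raise several Changed events in quick succession for one write, it may also help to debounce the handler (e.g., with a short timer) so ReadNewLines isn't invoked more often than necessary.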
Answers (0)