Hi!
How can I make a program in C, that reads some text information that is being displayed in a website?
I found this link:
Interprocess Communications (Windows)
But I don't know how to start...
Thanks!
Does it specifically need to be read while it's displayed in a browser? Because it may be easier, not to mention MUCH more portable (across OSs and versions of your browser), to use networking functions/libraries to download the web page and then parse it.
There are some libraries available that can help you download/parse web pages. I believe libcurl is one, so you may want to google that. I'm not sure whether it does parsing, but it should get you started.
If you are dead-set on reading it out of your actual browser, it will probably be much more difficult. I'd say it would come down to reading the memory of the browser process, or somehow locating where the browser is caching its files.
Well that would be the hard way of doing it!
If you want to just read a site and grab the HTML, then a library is probably the simplest approach.
cURL and libcurl
If you dance barefoot on the broken glass of undefined behaviour, you've got to expect the occasional cut.
If at first you don't succeed, try writing your phone number on the exam paper.
Yeah, at first I thought about saving the HTML and taking the information that I need from there. But the project must specifically read the text displayed in the browser (which may change).
Is there a way to make the C program save the webpage from Firefox periodically?
Thanks
Hi Salem!
I'll study the library you suggested.
Thanks!
Are you trying to do this?
Web scraping - Wikipedia, the free encyclopedia
No.
For example:
There is a home broker site with the prices of shares. I need to take the prices and do some math on them to make a decision, based on my model.
Ouch.
Well if you don't want to actually get messy with fetching pages yourself, consider...
Wget for Windows
Just give it a URL and a filename to save it in - job done.
Though TBH, I'd probably do this in Perl (or similar).
Not only does it already have libraries for fetching pages with minimal effort, there are plenty of tools for extracting the data as well. Perl's entire MO is easy text handling and processing.
String parsing and handling in C is exceedingly messy by comparison.
Have you checked the firefox plug-ins page for similar kinds of tools?
Even if there is nothing that does what you want, there may be something "close enough" that might be tempting as an alternative starting point.