
Web browser source code request..

Started by vanjast, November 19, 2011, 06:55:35 PM


vanjast

Did a search but seem to be going around in circles..

Can anyone point me towards web browser source code (assembly) and the background knowledge required to make a simple browser?
I'd imagine it'll be something like an html (and/or other) parser of sorts.

Thanks
Van

dedndave

i think it was Edgar that had some code for that
let me see if i can find it....

dedndave

ok - it was Rob - but i can't find the thread
here is the attachment
it looks like he adapted code by NaN - perhaps that's the user name we need to search


vanjast

I didn't explain myself properly - Here's what I'd like to do.

When a webpage downloads, I'd like to scan the html (or whatever it is) file, looking for things like adverts, javascript, etc.
The idea is to create a dynamic user web filter. Linked to this will be a TCP/FTP/UDP/etc monitor that will trap any asp type link attempts.
AND..added to this will be a 'Junk Feeder' that will allow the user to swop a transmitted packet with 'junk'.

Essentially a privacy protector package. Maybe this has been done already... but I'd like to try it... for phun!!
:bg
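
For illustration, here's roughly what that scan could look like: a minimal, untested sketch that just counts occurrences of "<script" in a page buffer (the buffer contents are made up, and it assumes the MASM32 runtime include; a real filter would need case-insensitive matching and much more):

[code]
; count "<script" occurrences in a zero-terminated page buffer
include \masm32\include\masm32rt.inc

.data
    page db "<html><script>x=1</script><p>hi</p></html>",0  ; stand-in page
    tag  db "<script",0

.code
start:
    mov esi, offset page
    xor ebx, ebx                ; ebx = number of hits
scan:
    mov al, [esi]
    test al, al
    jz done                     ; end of buffer
    cmp al, '<'
    jne next
    push esi                    ; candidate tag - compare against pattern
    mov edi, offset tag
@@: mov al, [edi]
    test al, al
    jz hit                      ; whole pattern matched
    cmp al, [esi]
    jne miss
    inc esi
    inc edi
    jmp @B
hit:
    inc ebx
miss:
    pop esi
next:
    inc esi
    jmp scan
done:
    print str$(ebx)             ; prints 1 for the buffer above
    print chr$(13,10)
    inkey
    exit
end start
[/code]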

zemtex

Ad Muncher is such a program: it filters ads, and it was written in asm. I know the guy who maintains the ad filter database; I talk to him on a regular basis.
I have been puzzling with lego bricks all my life. I know how to do this. When Peter, at age 6, is competing with me, I find it extremely necessary to show him that I can puzzle bricks better than him, because he is so damn talented that all that is called rational has gone haywire.

vanjast

That could be interesting...  :bg
It must be via scanning the html/asp downloaded (generally I'd block any asp type downloads), but it'll be interesting to see what flows up and down the net.

I'm busy playing around at the moment clicking on my usual bookmarks (KMeleon) and 99% of the webpages link to other monitoring links.
All going through port 80 - There's no other way.. and one cannot block Port-80
Besides that, the 'monitors' have a sh*tload of IP addys to link up on, and to manually filter through a router is... manual labour.
and my router runs on Javascript.... Why..Why do these monkeys do this ??? :tdown

I must make my own dsl router  :green2

Gunner

Write a proxy and filter everything that goes through it.
~Rob (Gunner)
- IE Zone Editor
- Gunners File Type Editor
http://www.gunnerinc.com
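
Along the lines of Gunner's suggestion, a bare skeleton of such a proxy might start like this (untested sketch; it assumes the MASM32 SDK's ws2_32.inc/lib and the winsock types from windows.inc, with all error handling and the actual forwarding/filtering left as comments):

[code]
; listen on 127.0.0.1:8080 and accept browser connections
include \masm32\include\masm32rt.inc
include \masm32\include\ws2_32.inc
includelib \masm32\lib\ws2_32.lib

.data
    wsa     WSADATA <>
    sin     sockaddr_in <>
.data?
    buffer  db 8192 dup(?)

.code
start:
    invoke WSAStartup, 202h, addr wsa        ; winsock 2.2

    invoke socket, AF_INET, SOCK_STREAM, IPPROTO_TCP
    mov ebx, eax                             ; ebx = listening socket

    mov sin.sin_family, AF_INET
    invoke htons, 8080                       ; port the browser is pointed at
    mov sin.sin_port, ax
    invoke inet_addr, chr$("127.0.0.1")
    mov dword ptr sin.sin_addr, eax
    invoke bind, ebx, addr sin, sizeof sockaddr_in
    invoke listen, ebx, 5

acceptLoop:
    invoke accept, ebx, NULL, NULL
    mov edi, eax                             ; edi = browser connection
    invoke recv, edi, addr buffer, sizeof buffer, 0
    ; ... parse the request line, connect to the real server,
    ;     forward the request, filter the reply, send it back ...
    invoke closesocket, edi
    jmp acceptLoop
end start
[/code]

The browser would then be configured to use 127.0.0.1:8080 as its HTTP proxy, so every request passes through the filter.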

vanjast

It's a thought..
My main aim was to provide a simple 'secure' browser under user preference control by 'hammering the page' as it downloads.
A proxy might be bypassable if any software has access at the kernel level.
:8)

vanjast

An idea which I think will be much better...
If I place a 'net-buffer' (so to speak) between my PC and router and connect the NetBuffer to the PC via USB, I'll have an 'untamperable' (code-wise, that is) filter.
In simple terms there will be no way that webmasters/programmers can circumvent the filter without access to the firmware... Also this little 'goodie' can be used for 'secure' comms.

I'll get down to making one of these right away...  :bg
:8)


jj2007

You could use URLDownloadToFile to get the file without risk. Then filter out everything that is "not essential" (by the way, MasmBasic is very, very good at that, with e.g. the Recall, Instr and Replace$ functions).
The problem is that nowadays an awful lot of essential things are being generated on the fly with JavaScript, using the DOM. What you get with filtering is most likely almost unusable...
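
For what it's worth, the URLDownloadToFile route jj2007 mentions can be as small as this (untested sketch; the URL and file name are placeholders, and it assumes the urlmon include/lib that ships with the MASM32 SDK):

[code]
; download a page to disk, then filter the local copy at leisure
include \masm32\include\masm32rt.inc
include \masm32\include\urlmon.inc
includelib \masm32\lib\urlmon.lib

.code
start:
    invoke URLDownloadToFile, NULL, chr$("http://www.example.com/"), chr$("page.htm"), 0, NULL
    .if eax == S_OK
        print "saved page.htm",13,10        ; now scan/strip the file offline
    .else
        print "download failed",13,10
    .endif
    inkey
    exit
end start
[/code]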

dedndave

yes - you aren't just talking about reading HTML files
there are so many other file types that one can lose count - lol
as Jochen mentioned, JavaScript is a big one - there are also php, css, asp/aspx, and so on
not to mention swf and all the other types of active content

when you start stripping all this stuff away, you can turn a robust webpage into a rather dull page of plain text
in many cases, the result is unusable - facebook and yahoo are good examples
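
To see how true that is, the brute-force version of 'stripping all this stuff away' is tiny - drop everything between < and > and keep the rest (untested sketch with a made-up test string; script/style bodies and entities are ignored, which is exactly why real pages come out mangled):

[code]
; strip <...> tags from a zero-terminated buffer, keep the bare text
include \masm32\include\masm32rt.inc

.data
    page db "<html><body><p>Hello <b>world</b></p></body></html>",0
.data?
    text db 256 dup(?)

.code
start:
    mov esi, offset page
    mov edi, offset text
    xor ecx, ecx                ; ecx = 1 while inside a tag
strip:
    mov al, [esi]
    test al, al
    jz done
    cmp al, '<'
    jne @F
    mov ecx, 1                  ; entering a tag
    jmp next
@@: cmp al, '>'
    jne @F
    xor ecx, ecx                ; leaving a tag
    jmp next
@@: test ecx, ecx
    jnz next                    ; inside a tag - drop the byte
    mov [edi], al
    inc edi
next:
    inc esi
    jmp strip
done:
    mov byte ptr [edi], 0
    print offset text           ; prints "Hello world"
    print chr$(13,10)
    inkey
    exit
end start
[/code]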

vanjast

Quote from: jj2007 on November 20, 2011, 07:28:20 AM
You could use URLDownloadToFile to get the file without risk. Then filter out everything that is "not essential" (by the way, MasmBasic is very, very good at that, with e.g. the Recall, Instr and Replace$ functions).
The problem is that nowadays an awful lot of essential things are being generated on the fly with JavaScript, using the DOM. What you get with filtering is most likely almost unusable...

That looks useful.