Wireless Internet System

Started by oex, July 23, 2010, 05:04:02 PM


oex

Hey guys,

I don't know a lot about this, however I did play with wireless systems a bit a few years back.... I was considering the maybe 10 wireless access points I have access to in my rural community without moving my computer, and wondered.... Why such a heavy reliance on landlines and satellite for fast broadband speeds?.... There are obvious security concerns, but couldn't the 100Mb? wireless bandwidth be utilised to some degree?....

For some high-bandwidth things such as Internet TV streams there will often be duplicate streams being downloaded at the same time in the same small area.... Further, much higher quality could be achieved with multiple connections each downloading part of the stream over landlines in parallel and then sharing it through wireless....

Additionally there could be less strain on the major infrastructure....

Just seems like common sense to me.... At least for low-security items like streaming news feeds.... We already have the ability for people to connect to others' wifi access points, and that is far less secure....

Maybe it just destroys 'business models' so isn't worthwhile.... Maybe there are already multi-connection download managers of some description....

Something like this would, I guess, enable smaller businesses to create intelligent google-style databases of webpages without massive connection strain on the system, giving much freer freedom of information :bg

You can get wifi systems that boot off CD under a Linux OS and serve local content, but I'm surprised there is no common-sense solution on a larger scale, such as with a company like BT (or I guess Google on a global scale).

I can see such a wireless-transfer-enabled system being invaluable and free in a mobile environment, especially in major cities.... maybe this already exists in major cities?.... I don't get out much, but if it does then there would be a demand in rural communities if some large company were to introduce it....

HD space is very cheap these days.... I guess you could have superfast local 'special interest' download channels.... When you look at the distribution of internet access most people end up on the same websites....

Anyways just a few thoughts.... All those unused airwaves gotta be good for something
We are all of us insane, just to varying degrees and intelligently balanced through networking

http://www.hereford.tv

Neo

Many ISPs / huge websites already have massive regional caches to help reduce latency if material is downloaded more than once, if that's what you meant.

A semi-related idea: a friend of mine wrote some sort of "driver" to let you connect to multiple wireless networks at once, but I think there are actual limits due to latency and due to TCP/IP's assumption of a single connection. E.g. conceptually you could download two images at once, one on each of two connections, but if they're in an HTML page, you have to have the HTML downloaded before you know which images to load. Although the HTML file is likely tiny, the latency can be (relatively) huge, especially for wireless networks. If you just want to download a bunch of files, you could download them "in parallel" (assuming no other bottlenecks, like 500 people using the same connection full bore).
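For the "bunch of files" case, the parallel idea can be sketched roughly in Python (a toy illustration only: `fetch_range` stands in for an HTTP Range request issued over one of several links, and none of these names come from a real driver or API):

```python
from concurrent.futures import ThreadPoolExecutor

def split_ranges(total_size, n_parts):
    """Split [0, total_size) into n_parts contiguous (start, end) byte ranges."""
    base, extra = divmod(total_size, n_parts)
    ranges, start = [], 0
    for i in range(n_parts):
        end = start + base + (1 if i < extra else 0)
        ranges.append((start, end))
        start = end
    return ranges

def fetch_range(data, start, end):
    # Stand-in for an HTTP Range request ("Range: bytes=start-(end-1)")
    # issued over one of several network links.
    return data[start:end]

def parallel_download(data, n_links=4):
    """Fetch one byte range per link concurrently, then reassemble in order."""
    ranges = split_ranges(len(data), n_links)
    with ThreadPoolExecutor(max_workers=n_links) as pool:
        parts = pool.map(lambda r: fetch_range(data, *r), ranges)
    return b"".join(parts)

payload = bytes(range(256)) * 100
assert parallel_download(payload) == payload
```

In practice the reassembly step is the easy part; as noted above, latency and any per-link bottleneck decide whether the parallelism actually pays off.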

ecube

One day they'll have cross-continent wireless internet :) Right now there are little cables that just stretch miles across the ocean to different continents. Someone could just drive out there and snip snip, and there goes millions of people's internet :( Granted, they have repair boats that patrol and fix things, but it's still a vast distance to monitor.

oex

I was already aware of ISP caching....

I also have a friend who was connecting 4 WLANs together into 1.... As for images and webpages, these are a bit too small.... I was thinking more along the lines of the 6 o'clock news on iPlayer or Microsoft Windows updates.... Things that are in the 100s of MB that essentially everybody in an area may be downloading.... This may also open up possibilities for what people could download....

There are 5 terrestrial news channels here in the UK; most people will watch the BBC news, which comes down at maybe 250Kb/s at crappy quality.... If 4 people in wireless range in a block of flats are watching it over the internet, the quality could be doubled with the same bandwidth.... (with no extra bandwidth issue for the BBC or other provider)
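The arithmetic behind that, as a rough back-of-the-envelope (assuming the viewers split one stream evenly and swap the pieces over wifi):

```python
# n viewers in wireless range each fetch 1/n of a shared stream over
# their own landline, then exchange the pieces locally over wifi.
def per_landline_rate(stream_kbps, n_viewers):
    return stream_kbps / n_viewers

# Today: 4 separate 250 Kb/s streams -> 250 Kb/s on each landline.
# Shared: one 500 Kb/s stream split 4 ways -> 125 Kb/s per landline,
# i.e. double the stream quality at half the per-line load.
assert per_landline_rate(500, 4) == 125.0
```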

If Microsoft releases a 100MB patch, this *could* be split into 4 x 25MB chunks with a bit of bit checking on 'trusted' wireless networks.... (This would also be a 75% reduction in M$ bandwidth.)
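The 'bit checking' part is essentially what torrents already do with per-piece hashes; a minimal sketch, assuming the publisher ships the chunk hashes over a trusted channel (all names here are made up for illustration):

```python
import hashlib

def chunk_digests(patch: bytes, chunk_size: int):
    """Publisher side: per-chunk SHA-256 hashes, distributed via a trusted channel."""
    return [hashlib.sha256(patch[i:i + chunk_size]).hexdigest()
            for i in range(0, len(patch), chunk_size)]

def verify_chunk(chunk: bytes, expected_digest: str) -> bool:
    """Receiver side: accept a peer-supplied chunk only if its hash matches."""
    return hashlib.sha256(chunk).hexdigest() == expected_digest
```

So a neighbour's copy of a chunk can be taken over wifi and checked against the publisher's hash before being trusted; only the small hash list has to come from the publisher itself.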


Additionally, for some of the big ideas like replicating the human brain, wireless would seem to me to make sense as a platform.... Yes, there are massive latency issues, however a silicon brain maintains state.... Putting 250,000 chips into 1 machine is massively more expensive than using 2,500,000 individually owned machines connected over wireless....

Distributed processing such as BOINC currently uses machines for raw processing power, but why not 'float' our search-engine capabilities :lol.... Keeping a small proportion of HD space on computers free, and transferring search results partially via wireless and partially via landlines, would make access to *real* search data more intelligent, more private and better featured....

ecube

Quote from: oex on July 24, 2010, 08:18:09 AM
To put 250,000 chips into 1 machine is massively more expensive than using 2,500,000 individually owned machines connected over wireless....

Yeah, that's why google uses hundreds of thousands of crappy computers for their servers: because they're cheap, easy to fix/replace, and it removes the single point of failure. Also, the other stuff you were talking about already exists in the awesome torrent technology; unfortunately, because of its association with piracy, many companies fail to utilize it.

oex

It is massively more expensive because you have to buy and run extra machines that don't need to be bought or run.... Most people run their machines at 1% or 2% CPU; regardless of what google uses, it is still a complete waste of resources for doing this job, although I take your point.... Intellectual property in this case comes before intelligence....

I didn't say this stuff didn't exist; you're right, the issue is that it isn't utilized.... On the one hand you have the hackers and on the other political ideologies.... It's just a shame we trash the world for completely unintelligent reasons.... There is a distinct lack of responsibility for technology in the world....

Borg is the US idea of communism I think :lol