Tuesday, December 12, 2006

Doug Engelbart and Ted Nelson

Doug Engelbart



Dr. Douglas Engelbart was born on January 30, 1925, in Oregon. He is an acclaimed inventor, best known for inventing the computer mouse, and a pioneer of human-computer interaction. He also worked on the team that developed hypertext, networked computers, and precursors to graphical user interfaces (GUIs).

In the late 1940s, Douglas Engelbart read Vannevar Bush's article "As We May Think" and became an early believer in Bush's idea of a machine that would aid human cognition. He later founded the Augmentation Research Center (ARC), a development environment at the Stanford Research Institute, where he built the On-Line System (NLS), the world's first implementation of what would come to be called hypertext.

The key tools that NLS provided were:
• outline editors for idea development
• hypertext linking
• tele-conferencing
• word processing
• e-mail
• user configurability and programmability
The development of these required the creation of:
• the mouse pointing device for on-screen selection
• a one-hand chording device for keyboard entry
• a full windowing software environment
• on-line help systems
• the concept of consistency in user interfaces

Throughout his career he has continued to be a committed and vocal ambassador for the development and use of computers and networks to help address some of the world's increasingly urgent and complex problems. Engelbart's work directly influenced the research at Xerox PARC, which in turn was the inspiration for Apple Computer. Ted Nelson cites him as a major influence. In 1991, Engelbart and his colleagues were given the ACM Software System Award for their work on NLS.



Links of Interest


http://sloan.stanford.edu/mousesite/1968Demo.html

http://www.ibiblio.org/pioneers/englebart.html

http://en.wikipedia.org/wiki/Doug_Englebart


Ted Nelson



Theodor Holm Nelson was born on June 17, 1937. He is an American sociologist, philosopher, and pioneer of information technology, and a somewhat controversial figure in the computing world. For some thirty years he has pursued grand ideas without ever seeing them through to completed projects. His biggest project, Xanadu, was to be a world-wide electronic publishing system that would have created a sort of universal library for the people. He is known for coining the term "hypertext." The main thrust of his work has been to make computers easily accessible to ordinary people. His motto is: "A user interface should be so simple that a beginner in an emergency can understand it within ten seconds."

In 1960, he enrolled in graduate school at Harvard. During his first year he attempted a term project: a writing system similar to a word processor, but one that would allow different versions and documents to be linked together nonlinearly, by association.

Nelson did not complete the project, but he continued to work on it after that semester, and it became the overriding concern of his life. In 1965, he presented a paper to the Association for Computing Machinery in which he coined the term hypertext. Nelson's system was very similar to the one envisioned by Vannevar Bush.
Xanadu

Nelson continued to expound his ideas, but he did not possess the technical knowledge to tell others how they could be implemented, and so many people simply ignored him (and have ever since). Still, Nelson persisted. In 1967, he named his system Xanadu, and with the help of interested, mainly younger, computer hackers he continued to develop it.



Xanadu was conceived as a tool to preserve and increase humanity's literature and art. It would consist of a world-wide network allowing information to be stored not as separate files but as connected literature. Documents would remain accessible indefinitely, and users could create virtual copies of any document. Instead of locking copyrighted materials away, the owners of documents would automatically be paid a micropayment, by electronic means, each time their documents were virtually copied.
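
To make the mechanism concrete, here is a tiny sketch in Python. It is emphatically not Nelson's actual design; the document, owner and fee are all made up. A "virtual copy" is modelled as a pointer into the original document, and each time it is resolved the owner is credited:

# Illustrative sketch only -- not Nelson's actual Xanadu design.
# A "virtual copy" is a pointer into the original document rather than
# a duplicated byte string, and each resolution credits the owner.

documents = {"doc1": {"owner": "alice", "text": "Literature is debugged."}}
royalties = {"alice": 0.0}
MICROPAYMENT = 0.001  # hypothetical fee per resolved virtual copy

def transclude(doc_id, start, end):
    """Return a virtual copy: a reference, not a pasted duplicate."""
    return {"source": doc_id, "span": (start, end)}

def resolve(link):
    """Fetch the referenced span and pay the owner a micropayment."""
    doc = documents[link["source"]]
    royalties[doc["owner"]] += MICROPAYMENT
    start, end = link["span"]
    return doc["text"][start:end]

quote = transclude("doc1", 0, 10)
print(resolve(quote))        # Literature
print(royalties["alice"])    # 0.001
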
Xanadu has never been completed and is far from being implemented. In many ways Tim Berners-Lee's World Wide Web is a similar, though much less grand, system. In 1999, the Xanadu code was made open source.


Links of Interest


Interview with Ted Nelson –
http://www.invisiblerevolution.net/ted-bar-it/dimensions-attr.html

http://xanadu.com.au/ted/

Sunday, December 10, 2006

VOIP

VOIP Solution

Voice over Internet Protocol (VoIP), also called IP telephony, is the routing of voice conversations over the Internet or any other IP-based network.
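
At its core, this means chopping audio into small packets and sending them across an IP network. The Python sketch below is purely conceptual: the peer address is a placeholder, and real VoIP stacks (e.g. SIP for signalling and RTP for transport) add sequencing, timestamps and jitter buffers on top of this idea.

import socket

# Conceptual sketch only. The peer address below is a placeholder
# from the documentation range, not a real endpoint.
PEER = ("192.0.2.10", 5004)
FRAME_SIZE = 160  # e.g. 20 ms of 8 kHz, 8-bit audio per packet

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_voice(frames):
    """Send each captured audio frame as one UDP datagram."""
    for frame in frames:
        sock.sendto(frame, PEER)

# A silent test "call": ten frames of zeroed audio.
send_voice([bytes(FRAME_SIZE) for _ in range(10)])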

Some cost savings come from using a single network to carry both voice and data, especially where users have existing underutilized network capacity they can use for VoIP at no additional cost. VoIP-to-VoIP calls on any provider are typically free, while VoIP-to-PSTN (public switched telephone network) calls generally cost the VoIP user money.

There are two types of PSTN-to-VoIP services: DID (Direct Inward Dialing) and access numbers. DID connects the caller directly to the VoIP user, while access numbers require the caller to input the extension number of the VoIP user. Access numbers are usually charged as a local call to the caller and are free to the VoIP user, while DID usually carries a monthly fee. There are also DID services that are free to the VoIP user but charge the caller.


Legal Issues

As the popularity of VoIP grows, and PSTN users switch to VoIP in increasing numbers, governments are becoming more interested in regulating VoIP in a manner similar to legacy PSTN services.
In the U.S., the Federal Communications Commission now requires all VoIP operators who do not support Enhanced 911 to attach a sticker warning that traditional 911 services are not available. The FCC recently required VoIP operators to support CALEA wiretap functionality, and the proposed Telecommunications Act of 2005 would add more traditional PSTN regulations, such as local number portability and universal service fees. Other future legal issues are likely to include laws on wiretapping and network neutrality.

In the European Union, treatment of VoIP service providers is a decision for each member state's national telecoms regulator, which must use competition-law theory to define relevant national markets and then determine whether any service provider on those markets has "significant market power" (and so should be subject to certain obligations). A general distinction is usually made between VoIP services that function over managed networks (via broadband connections) and those that function over unmanaged networks (essentially, the Internet).


Links of Interest

http://www.skype.com/

http://www.voip-info.org/wiki/

Participation

par•tic•i•pa•tion

1. an act or instance of participating.

2. the fact of taking part, as in some action or attempt: participation in a celebration.

3. a sharing, as in benefits or profits: participation in a pension plan.

–adjective
4. of or pertaining to a venture characterized by more than one person, bank, or company participating in risk or profit: a participation loan.

Dictionary.com Unabridged (v 1.0.1)
Based on the Random House Unabridged Dictionary, © Random House, Inc. 2006.



Wikipedia and WikiNews are two of the most prominent examples of successful and widely used collaborative web-editing tools that rely on user participation. Both, among others, highlight some of the key transformative elements of the newly emerging web. Other websites such as YouTube, eBay and MySpace also depend on users not only to view content but to provide it and to participate by adding more.

With the development and advance of recent technologies such as wikis, blogs, podcasting and file sharing, the traditional broadcast model is being challenged, and community-driven services are rapidly gaining influence. These new forms of media erase the clear distinction between information providers and consumers. The line between producers and consumers is blurred even further by services such as Wikipedia, where every reader can instantly become an author.


Links of Interest

http://www.forbes.com/intelligentinfrastructure/2006/04/27/video-youtube-myspace_cx_df_0428video.html

Located Media


Location-based media delivers multimedia directly to the users of mobile devices depending on their location. The media can be delivered to, or triggered within, any portable wireless device that is GPS-enabled and has the capacity to display audiovisual content.

Media content is managed and organised off the device, on a standard desktop or laptop. The device then downloads this formatted content, with GPS-coordinate triggers applied to each media sequence. As the location-aware device enters a selected area, its GPS fix triggers the assigned media, designed to be of optimal relevance to the user and their surroundings.
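
As a rough sketch of how such triggering might work, the device can compare each GPS fix against a list of trigger zones. The Python below is illustrative only; the coordinates, radius and media file are invented for the example.

import math

# Hypothetical content bundle: each entry pairs a media file with the
# coordinates and radius (metres) of the zone that should trigger it.
TRIGGERS = [
    {"media": "townhall_history.mp3", "lat": -27.4689, "lon": 153.0235, "radius": 50},
]

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance between two fixes, in metres."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def check_triggers(lat, lon):
    """Return the media whose trigger zone contains the current fix."""
    return [t["media"] for t in TRIGGERS
            if distance_m(lat, lon, t["lat"], t["lon"]) <= t["radius"]]

print(check_triggers(-27.4690, 153.0235))  # inside the 50 m zone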

Location-based media allows for the enhancement of any given environment, offering explanation, analysis and detailed commentary on what the user is looking at through a combination of video, audio, images and text. The location-aware device can deliver interpretation of cities, parklands, heritage sites, sporting events or any other environment where location-based media is required.

Content production and pre-production are integral to the overall experience and must be performed with careful consideration of the location and the user's position within it. The media offers a depth to the environment beyond what is immediately apparent, revealing background, history and current topical feeds.


Links of Interest

http://wiki.media-culture.org.au/index.php/Technologies_-_Locative_Media

http://www.geotracing.com/

http://locative.x-i.net/

Wireless Internet

Many people use wireless networking, also called WiFi or 802.11 networking, to connect their computers at home, and an increasing number of cities use the technology to provide free or low-cost Internet access to residents. In the near future, wireless networking may become so widespread that you can access the Internet just about anywhere at any time, without using wires.


WiFi has a lot of advantages. Wireless networks are easy to set up and inexpensive to run. They are also unobtrusive; unless you are on the lookout for a place to use your laptop, you may not even notice when you are in a hotspot.

A wireless network uses radio waves, just as televisions, mobile phones and radios do. In fact, communication across a wireless network is a lot like two-way radio communication. Here is what happens:
1. A computer's wireless adapter translates data into a radio signal and transmits it using an antenna.
2. A wireless router receives the signal and decodes it. It sends the information to the Internet using a physical, wired Ethernet connection.
The process also works in reverse, with the router receiving information from the Internet, translating it into a radio signal and sending it to the computer's wireless adapter.
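
From an application's point of view, the radio hop is invisible: the same networking code runs whether the link is wired or wireless. The Python sketch below shows that round trip in miniature, with both ends on one machine purely for illustration; the port number is arbitrary.

import socket
import threading
import time

# Minimal sketch: to an application the radio layer is invisible --
# this same code runs whether the link is Ethernet or 802.11.

def echo_server(port):
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", port))
    srv.listen(1)
    conn, _ = srv.accept()
    conn.sendall(conn.recv(1024))  # bounce the request straight back
    conn.close()
    srv.close()

threading.Thread(target=echo_server, args=(50007,), daemon=True).start()
time.sleep(0.2)  # give the server a moment to start listening

cli = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
cli.connect(("127.0.0.1", 50007))
cli.sendall(b"hello over the (possibly wireless) network")
print(cli.recv(1024))  # the reply makes the same trip in reverse
cli.close()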

When the technology was first commercialized, there were many problems because consumers could not be sure that products from different vendors would work together. The Wi-Fi Alliance began as a community to solve this issue, addressing the needs of the end user and allowing the technology to mature. The Alliance created the Wi-Fi CERTIFIED branding to show consumers that products are interoperable with other products displaying the same branding.


Links of Interest

http://www.wi-fi.org/

Stars of C.C.T.V.


CCTV stands for closed-circuit television. It is a form of surveillance in which the picture is viewed or recorded by a limited number of monitors. It differs from broadcast television in that the signal is not openly transmitted. CCTV is often used for surveillance in areas that need security, such as banks, casinos and airports. Today it has developed to the point where it is simple and inexpensive enough to be used in home security systems and for everyday surveillance.






A basic closed-circuit television network

The widespread use of CCTV by the police and governments has developed over the last 10 years. In the UK, cities and towns across the country have installed large numbers of cameras linked to police authorities. The justification for the growth of CCTV in towns is that it deters crime, although there is still no clear evidence that CCTV reduces crime. The recent growth of CCTV in housing areas also raises serious issues about the extent to which CCTV is being used as a social control measure rather than simply a deterrent to crime.

The first CCTV cameras used in public spaces were crude, conspicuous, low-definition black-and-white systems without the ability to zoom or pan. Modern CCTV systems use small, high-definition colour cameras that can not only focus to resolve minute detail, but, because control of the cameras is linked to a computer, can also track objects semi-automatically. For example, they can track movement across a scene where there should be no movement, or lock onto a single object in a busy environment and follow it. Being computerised, this tracking can also work across cameras.
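
A very simple version of detecting "movement where there should be none" can be sketched with frame differencing. The Python example below uses OpenCV, which is one common choice assumed here (the text names no particular software); real CCTV analytics are far more sophisticated.

import cv2  # OpenCV -- an assumption; the text names no library

# Flag movement by differencing consecutive frames and counting
# how many pixels changed.
cap = cv2.VideoCapture(0)  # 0 = first attached camera
ok, prev = cap.read()
if not ok:
    raise SystemExit("no camera found")
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev)  # pixel-wise change since last frame
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    if cv2.countNonZero(mask) > 500:  # threshold to tune per scene
        print("movement detected")
    prev = gray

cap.release()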

The development of CCTV in public areas, linked to computer databases of people's pictures and identity, presents a serious risk to civil liberties. Potentially you will not be able to meet anonymously in a public place. You will not be able to drive or walk anonymously around a city. Demonstrations or assemblies in public places could be affected, as the state would be able to collate lists of those leading them, taking part, or even just talking with protesters in the street.
Interestingly, the use of CCTV cameras in the United States is much less common than in the UK, though it is increasing. In 1998 there were a mere 3,000 CCTV systems in New York City.



The most measurable effect of CCTV is not on crime prevention but on detection and prosecution. Several notable murder cases have been solved with the use of CCTV evidence, notably the Jamie Bulger case and the capture of David Copeland, the Soho nail bomber. The use of CCTV to track the movements of missing children is now routine.


Links of Interest

www.baitcar.com

www.cyber-rights.org

www.spystoreuk.com

www.iviewcameras.co.uk

www.ubermatic.org/life

Tuesday, October 17, 2006

Peer to peer networks

A peer-to-peer computer network is a network that relies primarily on the computing power and bandwidth of the participants in the network rather than concentrating them in a relatively small number of servers. Such networks are useful for many purposes. Sharing content files containing audio, video, data or anything in digital format is very common, and realtime data, such as telephony traffic, is also passed using P2P technology.

A pure peer-to-peer network has no notion of clients or servers, only equal peer nodes that simultaneously function as both "clients" and "servers" to the other nodes on the network. This arrangement differs from the client-server model, where communication is usually to and from a central server. A typical example of non-peer-to-peer file transfer is an FTP server, where the client and server programs are quite distinct: clients initiate downloads and uploads, and the server reacts to and satisfies these requests.




Some networks and channels, such as Napster and OpenNAP, use a client-server structure for some tasks (e.g. searching) and a peer-to-peer structure for others. Networks such as Gnutella or Freenet use a peer-to-peer structure for all purposes and are sometimes referred to as true peer-to-peer networks.


Classification of P2P networks


One possible classification of peer-to-peer networks is according to their degree of centralisation:

Pure peer-to-peer:
• Peers act as equals, merging the roles of client and server
• There is no central server managing the network
• There is no central router

Hybrid peer-to-peer (a sketch follows this list):
• Has a central server that keeps information on peers and responds to requests for that information.
• Peers are responsible for hosting available resources (as the central server does not have them), for letting the central server know what resources they want to share, and for making their shareable resources available to peers that request them.
• Route terminals are used as addresses, which are referenced by a set of indices to obtain an absolute address.
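
Here is a minimal Python sketch of the hybrid model, with made-up peer names: the central server only indexes who shares what, while the files themselves stay on the peers and travel directly between them.

# Minimal sketch of the hybrid model above (all names are made up).
# The central index only records who shares what; the files
# themselves stay on the peers and are transferred peer-to-peer.

index = {}  # filename -> set of peer addresses sharing it

def register(peer, filenames):
    """A peer tells the central server which files it is sharing."""
    for name in filenames:
        index.setdefault(name, set()).add(peer)

def search(name):
    """The server answers searches but never serves the file itself."""
    return sorted(index.get(name, set()))

register("peer-a:6881", ["song.mp3", "talk.ogg"])
register("peer-b:6881", ["song.mp3"])

# A requesting peer would now download directly from one of these
# addresses; that transfer never touches the index server.
print(search("song.mp3"))  # ['peer-a:6881', 'peer-b:6881']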



Where P2P works:

Simple implementations that usually work for regular people. Complex setup will cut usage. For example, setting up Napster is easy, while Gnutella (which doesn't use a centralized organizer) may require a lot of understanding of IP addresses, routing and firewalls. The parts that must be simple include how files are uploaded from the sharing computer. The fact that the shared copy goes to another random PC rather than a centralized server has little bearing on how easy it is. What makes Napster easy is its special client-based program for uploading, like many photo websites, not the fact that it's P2P. Separate FTP programs, or browser primitives for uploading, are not usually simple and give uploading a bad name.

The same data on many different PCs. If only one PC has the data, access to it could be unreliable.

The files are static; the information being downloaded never changes. The files shared with Napster are not news feeds -- they are more likely the works of dead musicians.

Data such that you don't mind trusting the person sharing it. If a music file for personal use was converted from CD to MP3 poorly, many people don't care. If the file being downloaded is destined for broadcast or other commercial purposes, having an appropriate trust relationship with the source may be important, and that complicates things and may not be practical.

Lots of college students with desktops connected to local ethernets.

Where P2P probably does not work as well:

Unique content on each PC where reliability or constant availability is important. When I want your pictures, I don't want to have to call you up and have you boot up your PC. (Like when you have one phone line and an old fax: call first, then tell me to connect the fax -- if I'm home -- and then call back...) Many of us live with laptops that are only connected during the day and infrequently (hopefully) at night.

Content that constantly changes. This makes the copies people recently downloaded obsolete, and effectively gets back to only one copy on one specific PC, except you may not know it.

Content that requires a trust relationship with the source. Because of the "search everywhere" nature and simple signup of Napster-style P2P, caveat emptor.

Reliable connection speed. The data you want may be on a low-speed link.


Importance of peer to peer

Peer-to-peer as a term has often been associated with online file-sharing networks, where users all over the globe trade files with one another. But in recent times peer-to-peer collaboration and the setting up of networks (both on the internet and in office intranets) have expanded to encompass a vast range of ways for people to collaborate effectively, regardless of their location.

Michel Bauwens is the founder of the Foundation for Peer-to-Peer Alternatives and a strong advocate of the view that peer-to-peer is not simply a technology but a new way of living. As the worlds of media and business shift away from a top-down, hierarchical mode of operation and open up to the creativity and productivity of the public, and of each individual within an organization, society is going through necessary and timely changes.


Michel Bauwens




“The basic idea I had was that there’s a new social movement emerging which is really about extending the realm of participation to the whole of life. We live in a representative democracy, which says you can vote every four years, and choose which people will exercise power on your behalf… now we’re building tools and resources which say everybody needs to be involved, and everybody should have a voice.”

Michel Bauwens on YouTube -
http://www.youtube.com/results?search_query=michel+Bauwens&search=Search

Links of Interest

http://www.darknet.org.uk/

http://www.compinfo-center.com/int/p2p.htm

http://www.microsoft.com/technet/prodtechnol/winxppro/deploy/p2pintro.mspx