Monday, December 22, 2008

Browser Wars

A long time ago, in a galaxy far far away...
No wait!! That's another set of wars that we are talking about here, one that does not involve lightsabers and droids. According to Wikipedia, the term "Browser Wars" refers to the competition for dominance in the Web browser marketplace. So far, there have been two major browser wars:

Browser War I (late 1990s)

Main opponents: Netscape Navigator (aka Netscape Communicator) and Microsoft Internet Explorer (IE)

In order to compete, the two browsers kept on adding features to one-up each other. Each browser had its own implementation of JavaScript (which were not compatible). Each browser had its own set of supported non-standard HTML tags. Adding new features had higher priority over fixing bugs, which resulted in both browsers being somewhat unstable.

Microsoft delivered the winning blow by integrating Internet Explorer with its Windows operating system, which made the browser readily available to every Windows user - a move that was broadly criticized.

Effect on the Web experience:
BW-I was a time of Web chaos: shaky Web-standards compliance, frequent browser crashes, and many security holes. It was hard to design Web-sites that could behave similarly on both browsers, and thus it was common for Web designers to display 'best viewed in Netscape' or 'best viewed in Internet Explorer' logos. Some Web-sites even went as far as to work only on one browser or the other. This was indicative of the divergence between the "standards" supported by the browsers and signified which browser was used for testing the pages.

Browser War II (2003 - present)

After Netscape was defeated, they open-sourced their browser code, which led to the formation of the Mozilla Foundation — a community-driven project to create a successor to Netscape. After several years, the new browser "Firefox" was born (version 1.0 was released on 9 November 2004). Since then it has continued to gain an increasing share of the browser market, and became the main competitor against Internet Explorer.

Other contenders joined the war at different points in time, including (but not limited to) Opera, Safari, and the most recent contender, Google Chrome.

BW-II differs from BW-I in a major aspect: The contenders try as much as they can to work under the umbrella of the Web-standards. All browsers have compatible JavaScript engines (except for minor differences), and support more-or-less the same set of widely-recognized HTML (or XHTML) tags. Whenever a new feature is added to a browser, it soon becomes an expected feature in all the others (e.g. tabbed browsing, pop-up blocking, phishing filters, etc.). The contenders compete mainly in the following areas:

  • Browser speed (the time it takes to load pages)

  • Resource usage (amount of memory and CPU needed)

  • Stability

  • Security (vulnerability to malicious code, holes that can be exploited, etc.)

As of this writing, Internet Explorer still holds the largest market share, but the other browsers (particularly Firefox) are especially popular among IT professionals because of serious security flaws in IE, and because of unique features the other browsers provide (e.g. Firefox's support for custom extensions, and the multitude of extensions available online, which make it possible to personalize the browser to each user's needs). IE also seems to be falling behind in terms of browser speed: in a recent browser benchmark comparison, IE proved to be the worst among all tested browsers.

Effect on the Web experience:

Unlike BW-I, the current browser war is proving to be in the best interest of the user. The competition is bringing out the best of all competitors, and providing more and more features that help enrich the Web experience. The majority of Web sites today behave exactly the same on all Web browsers, and it is considered a design-flaw if a Web site does not work correctly on a certain browser. Tools are available to encourage (and sometimes enforce) using only the recognized Web-standards when designing a Web site. These standards have been vastly extended since the first browser war, and supporting non-standard elements is no longer an issue.

Saturday, December 13, 2008

Grizzly 1.7.0

Source :

Grizzly 1.7.0 - Transport layer Details.

I've been working on integrating the Grizzly transport into the Glassfish ORB for quite some time now. We have long been looking for ways to improve the performance of CORBA request processing at the transport level in both Glassfish and the JDK. That included connection management, parsing messages, encoding and decoding messages, etc. When we looked at the existing implementations in the open source field, we could not retrofit our requirements (from IIOP) into the existing frameworks. On top of this, we wanted a performance-centric I/O framework. That brought up a new project called Grizzly, with a submodule framework using java.nio.

Here is a brief definition of Grizzly terminology:


Filter:

A filter is a component that, when placed in a stream, filters that stream. The interface tied to this is ProtocolFilter, which has two methods: execute() and postExecute(). The filter can also be transformed into various forms based on the needs of the Grizzly framework user. For example, ParserProtocolFilter is designed to read a certain number of available bytes from a given stream and then use a given parser implementation object to parse the read byte buffers.


Context:

A context is a placeholder object that describes the current state of event processing in the Grizzly framework. A Context comes to life at the beginning of the selection cycle and ends in the callback handlers. Context has a recycle() method to recycle its stateful information. All this happens in another interesting and important class called Controller.

Controller and Event Handling:

The Controller does the all-important event handling in Grizzly. It takes two approaches to handling NIO events: one through callback handlers, and the other through a filter chain. The filter chain gets executed sequentially (the Chain of Responsibility pattern) until all the filters in the chain are exhausted, or until the current filter says to break the chain. After this, each filter's postExecute() method gets called in reverse order, so the filters can take appropriate actions while exiting the chain. Please remember that the Controller uses only one approach, not both, at any given time.
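As a standalone illustration of that forward/reverse pass, here is a minimal sketch of the pattern. The interface mirrors the execute()/postExecute() shape described above, but the class and method names are mine; this is not Grizzly's actual code.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of a Grizzly-style filter: execute() runs on the way in,
// postExecute() on the way out (assumed simplified signatures).
interface ProtocolFilter {
    // Returns true to continue down the chain, false to break it.
    boolean execute(StringBuilder ctx);
    void postExecute(StringBuilder ctx);
}

// Chain of Responsibility: forward pass until a filter breaks the
// chain, then postExecute() in reverse order on every filter that ran.
class ProtocolChain {
    private final List<ProtocolFilter> filters = new ArrayList<>();

    ProtocolChain add(ProtocolFilter f) { filters.add(f); return this; }

    void run(StringBuilder ctx) {
        int last = -1;
        for (int i = 0; i < filters.size(); i++) {
            last = i;
            if (!filters.get(i).execute(ctx)) break;   // break the chain
        }
        for (int i = last; i >= 0; i--) {
            filters.get(i).postExecute(ctx);           // reverse pass
        }
    }
}

public class ChainDemo {
    // A filter that records its forward and reverse visits in the context.
    static ProtocolFilter named(String name) {
        return new ProtocolFilter() {
            public boolean execute(StringBuilder ctx) {
                ctx.append(name).append(">");
                return true;
            }
            public void postExecute(StringBuilder ctx) {
                ctx.append("<").append(name);
            }
        };
    }

    public static void main(String[] args) {
        StringBuilder trace = new StringBuilder();
        new ProtocolChain().add(named("A")).add(named("B")).run(trace);
        System.out.println(trace); // prints A>B><B<A
    }
}
```

Running it shows the ordering the post describes: execute() in chain order, then postExecute() in reverse.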

Callback handlers are very obvious from the code. The events they can handle are OP_READ, OP_WRITE, and OP_CONNECT. Every time there is an event, the corresponding onRead() / onWrite() / onConnect() method gets executed based on the type of the event. To accomplish this, one first needs to define a connector handler and a callback handler. The Controller uses SelectorHandlers and ConnectorHandlers to process events on the server side and the client side. Take a look at the Controller code in the Grizzly workspace for a better understanding.


ConnectorHandler:

The ConnectorHandler is the one that lives on the client side. It uses callback handlers to perform the callback action based on the event type. That means, say we are writing a client-side implementation: we know where to contact the server (host:port info), and once a call to connect(..) is made, a connection is established. ConnectorHandler has specific methods for reading and writing in both blocking and non-blocking modes.


SelectorHandler:

A SelectorHandler basically runs in a separate thread (this can be configured according to one's needs) and is a listener for a given channel. It accepts events and handles new connection requests. The Controller is the one that kicks off the selector handler's selection cycle. Upon selection, each selector handler delegates the event handling (read, write, or connect) to a ContextTask (a unit of work) object and places the context task on a queue. Note that ContextTask is a Callable, and hence gets picked up by the next available Grizzly worker thread to do the task. This is how each event is processed in Grizzly.
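To make the selection cycle concrete, here is a bare java.nio sketch using only the plain JDK (no Grizzly classes): one selector accepts a loopback connection and echoes what it reads. Grizzly layers SelectorHandler, ContextTask, and worker threads on top of a loop like this; the class and method names below are mine, for illustration only.

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.nio.ByteBuffer;
import java.nio.channels.SelectionKey;
import java.nio.channels.Selector;
import java.nio.channels.ServerSocketChannel;
import java.nio.channels.SocketChannel;
import java.nio.charset.StandardCharsets;

public class EchoSketch {
    // Runs one tiny selection cycle: accept a loopback client,
    // read its message, echo it back, and return what the client got.
    public static String roundTrip(String msg) throws IOException {
        Selector selector = Selector.open();
        ServerSocketChannel server = ServerSocketChannel.open();
        server.bind(new InetSocketAddress("127.0.0.1", 0)); // ephemeral port
        server.configureBlocking(false);
        server.register(selector, SelectionKey.OP_ACCEPT);

        // Blocking client in the same JVM, to keep the demo self-contained.
        SocketChannel client = SocketChannel.open(
            new InetSocketAddress("127.0.0.1", server.socket().getLocalPort()));
        client.write(ByteBuffer.wrap(msg.getBytes(StandardCharsets.UTF_8)));

        ByteBuffer buf = ByteBuffer.allocate(256);
        boolean echoed = false;
        while (!echoed) {
            selector.select();                        // the selection cycle
            for (SelectionKey key : selector.selectedKeys()) {
                if (key.isAcceptable()) {
                    SocketChannel ch = server.accept(); // new connection
                    ch.configureBlocking(false);
                    ch.register(selector, SelectionKey.OP_READ);
                } else if (key.isReadable()) {
                    SocketChannel ch = (SocketChannel) key.channel();
                    buf.clear();
                    int n = ch.read(buf);               // OP_READ event
                    buf.flip();
                    ch.write(buf);                      // echo back
                    echoed = n > 0;
                }
            }
            selector.selectedKeys().clear();
        }
        ByteBuffer reply = ByteBuffer.allocate(256);
        client.read(reply);                             // blocking read
        reply.flip();
        String out = StandardCharsets.UTF_8.decode(reply).toString();
        client.close();
        server.close();
        selector.close();
        return out;
    }

    public static void main(String[] args) throws IOException {
        System.out.println(roundTrip("ping"));
    }
}
```

In Grizzly, the work done inline in the isReadable() branch would instead be packaged as a ContextTask and handed to a worker thread from the pipeline.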

Pipeline (a thread pool):

Grizzly defines a default pool of threads and calls it a pipeline. Users can configure this pool implementation and it's very obvious from the code.


ProtocolChain:

A ProtocolChain is a chain of filters. It can be used to read, write, and parse a given stream from a channel.


-- Copied

Sharing is Caring

Saturday, November 29, 2008

Internet Routing in Space (IRIS)

Internet Routing in Space, also known as IRIS, is a project being conducted by the U.S. Department of Defense to place an IP (Internet protocol) router on a geostationary satellite. The project is intended for military communications but may eventually be used by the private sector as well. If widely implemented, this technology has the potential to dramatically increase flexibility and traffic handling capability compared with existing satellite Internet systems.

With current satellite Internet technology, transmission of packets between earth-based end users requires that the data be sent from the source end user to the satellite, where it is received and retransmitted by a repeater. The signal then goes down to a centralized router on the surface, then back up to the satellite where it is processed by a second repeater and then sent down to the destination end user. That means every packet must be received and retransmitted at least three times and must make two complete round trips to the satellite. In the IRIS system, the satellite will receive packets directly from the source and transmit them directly to the destination, eliminating all intermediate surface nodes and requiring only one round trip to the satellite. This will reduce the latency, simplify the system, improve reliability and lower the overall maintenance cost.

The IRIS project is scheduled to be completed and the satellite launched in 2009. Cisco Systems is designing the software for the on-board router. The hardware is being built by Intelsat, the largest provider of fixed satellite services worldwide. Overall coordination will be done by the U.S. Defense Information Systems Agency.

NASA reported in November 2008 that the Agency had completed a successful test of a deep space communications network for their similar project, Interplanetary Internet.

> NASA reports on their first successful test of deep space communications network.
> BBC News describes the basics of the IRIS project.
> has more information about Internet Routing in Space.

Friday, October 24, 2008

Fantastic Contraption...When Physics can be Fun!!!

While skimming through various blogs, I came across a very interesting post on the How-to Geek. The post was about yet another online flash game, but this time it is based on physics and puzzle solving, so I thought I'd give it a try.

Fantastic Contraption turned out to be an extremely fun game to play. The objective of the game is to move the red object (usually a circular object, but sometimes other shapes) to the red area (called the "goal"). Sometimes there are some obstacles in the way. To achieve the goal, you have to build a "contraption" using the tools at the top of the screen, and use this contraption to move the red object to the goal area. To make things even harder, you can only build your contraption within the light blue area (the workshop).

The game comes with a large number of levels, each of which has a different arrangement of objects. One good thing about this game is that you don't have to proceed through the levels sequentially. If you find yourself stuck at a particular level, you can go to the main menu and play a different level of your choosing.

You can also save your contraptions (in mid-level, or after you've solved the level), but you have to create an account in order to do this (which is free). You will be given a link that you can share with others so they can directly see your contraption in action. You can also upload your contraptions to the server. Once you solve a level, you can view contraptions built by other users for that particular level. Here are some of the contraptions I've built (A, B, C, D, E, F, G, H, I, J, K, L, M, N, O). I am not listing them in level order, or even including the level, so that I don't spoil your experience with the game.

Warning: The game is very addictive, so make sure you don't lose track of time while playing it.
Have fun!!

Saturday, September 27, 2008

Embedded open source Java reporting library

Have you ever needed to create a report from your Java code, change its content or design, pass it parameters, and then export it as PDF, XLS, or CSV? Surely you have.

An open source Java library for this is now available from JasperSoft, the market leader in open source business intelligence tools. The reporting library is called JasperReports.

JasperReports is the world's most widely used open source reporting engine.

You can download its JAR from:

then choose the required package version, and choose download.

After the download finishes, you will find the jasperreports-version.jar file on your hard disk.

You can copy this JAR file into the lib folder of your Java project.

Now you surely want to create your first report using JasperReports. There is a GUI designer for creating reports by simply dragging and dropping report components from a toolbox.

This designer is called iReport, and it simplifies the development of even the most complex reports.

You can download it from this link:

and choose the installation that suits you. iReport is also now available as a plugin for the NetBeans IDE.

A tutorial to get you started creating reports with iReport is available for download from:

Accessing your reports from your Java code will be covered in a new post, coming soon, after you have finished creating your reports using iReport.
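Until that post arrives, here is a hedged sketch of the typical JasperReports flow: compile a .jrxml design (such as one produced with iReport), fill it with parameters and a data source, and export it to PDF. It requires the jasperreports JAR (and its dependencies) on the classpath; the file names and the parameter name are illustrative, not taken from a real report.

```java
import java.util.HashMap;
import java.util.Map;

import net.sf.jasperreports.engine.JREmptyDataSource;
import net.sf.jasperreports.engine.JasperCompileManager;
import net.sf.jasperreports.engine.JasperExportManager;
import net.sf.jasperreports.engine.JasperFillManager;
import net.sf.jasperreports.engine.JasperPrint;
import net.sf.jasperreports.engine.JasperReport;

public class ReportDemo {
    public static void main(String[] args) throws Exception {
        // 1. Compile the report design created in iReport
        //    ("myReport.jrxml" is a placeholder file name).
        JasperReport report =
            JasperCompileManager.compileReport("myReport.jrxml");

        // 2. Pass values for parameters declared in the design
        //    ("REPORT_TITLE" is a hypothetical parameter).
        Map<String, Object> params = new HashMap<>();
        params.put("REPORT_TITLE", "Monthly Summary");

        // 3. Fill the report; a real application would supply a JDBC
        //    connection or a JRDataSource instead of the empty source.
        JasperPrint filled =
            JasperFillManager.fillReport(report, params, new JREmptyDataSource());

        // 4. Export to PDF (exporters for XLS, CSV, etc. exist as well).
        JasperExportManager.exportReportToPdfFile(filled, "myReport.pdf");
    }
}
```

The same JasperPrint object can be exported to several formats, which is what makes the compile/fill/export split convenient.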

Sharing is Caring

Thursday, September 18, 2008

I need this baby in a month, send me nine women!!

Don't be tricked by the title, I really don't need a baby in a month! Here is the story...

Yesterday, I dropped by a post on Joel on Software: "Stack Overflow Launches"... It was talking about launching a collaboratively edited question and answer site for programmers called Stack Overflow.

Stack Overflow is not like an ordinary forum, where questions and answers go in the form of a discussion, with possibly wrong answers and spam in between! One of the most significant features of Stack Overflow is its voting system, where questions and answers are voted on for correctness. Top-voted answers float above the down-voted ones. This helps you find the desired solution at the top of the page. You can discover more features by visiting Stack Overflow.

"...If you’re generally interested in programming and want to learn something new every day, visit the hot tab frequently."
Interesting... I added Stack Overflow's feed to my RSS reader. And today, I read this question:
"Under what circumstances - if any - does adding programmers to a team actually speed development of an already late project?"
And it was titled: I need this baby in a month - send me nine women! That's the mystery behind the title. The question discusses one aspect of Software Engineering during the management of the project. Answers showed important considerations when attaching new programmers to running projects. For instance:
The individuals proposed to be added to the project must:
  • Have at least a reasonable understanding of the problem domain of the project
  • Be proficient in the language of the project and the specific technologies that they would use for the tasks they would be given
  • Be neither much weaker nor much stronger than the weakest and strongest existing members, respectively. Weak members will drain your existing staff with tertiary problems, while a new person who is too strong will disrupt the team by insisting that everything it has done and is doing is wrong.
  • Have good communication skills
  • Be highly motivated (e.g. be able to work independently without prodding)
You can read the full answer here...
I have experienced some of the points in the answer myself. Indeed, it takes a long time to understand the code base, conventions, problem domain, etc. of a running project, which can waste a great deal of resources and time!!

Other answers mention The Mythical Man-Month, which is:
a book on software project management by Fred Brooks, whose central theme is that "Adding manpower to a late software project makes it later." This idea is known as Brooks's law.
Follow the links above through the post, and enjoy! :)

Wednesday, September 17, 2008


OpenBSD is a free open source operating system based upon the Berkeley Software Distribution (BSD) for UNIX.

The OpenBSD project, coordinated by Theo de Raadt, is known in the programming community for its attention to security. His team is perhaps best known for developing OpenSSH, an open-source secure shell daemon for encrypting network packets.

The project is also known for introducing several important changes to the way the rest of the open source community works, including providing public access to its concurrent versions system (CVS) repositories and commit (code change) logs. Because OpenBSD is both compact and secure, one of the most common reasons for implementing OpenBSD is as a firewall.

According to

OpenBSD is developed by volunteers. The project funds development and releases by selling CDs and receiving donations from organizations and individuals. These finances ensure that OpenBSD will continue to exist, and will remain free for everyone to use and reuse as they see fit.
The OpenBSD logo and mascot is a pufferfish named "Puffy."

OpenBSD may be downloaded from
The OpenBSD Foundation supports OpenBSD and related projects like OpenSSH, OpenBGPD, OpenNTPD, and OpenCVS.
Wikipedia's entry for OpenBSD provides more technical information and history of the distribution.
Get an executive summary of what's new in OpenBSD 4.1 at the Enterprise Linux Log.

Thursday, September 11, 2008

Large Hadron Collider


Large Hadron Collider

The Large Hadron Collider (LHC) is a particle accelerator under development by CERN, the world's largest organization devoted to particle physics. A particle accelerator, sometimes called an "atom smasher" by lay people, is a device that propels subatomic particles called hadrons at high speeds. Machines such as the LHC make it possible to split particles into smaller and smaller components in the quest for the identification of so-called elementary particles, from which all matter and energy might derive.

Watch a video about the Large Hadron Collider.

The LHC, located at CERN headquarters, conducted its first test on September 10th, 2008. In operation, the LHC is expected to replicate, on a miniature scale, the conditions existing in the universe a tiny fraction of a second after the Big Bang. Thus, it may be possible to discern what happened in the early evolutionary stages of the universe. Among other things, the LHC may yield evidence of further dimensions beyond our familiar four (three spatial dimensions, plus time).

The LHC is expected to help physicists, astronomers and cosmologists answer questions about the nature and origins of matter, energy and the universe. For example:

  • Is antimatter simply a "mirror image" of matter or is the relationship more complex?
  • Why does matter seem to predominate over antimatter in the universe?
  • Why didn't all the matter and antimatter combine long ago, converting the whole universe into energy?
  • What is the nature of dark matter?
  • Why do only some particles have mass?

The LHC will use intense magnetic fields generated by superconductivity to accelerate hadrons in a circular path 27 kilometers (about 17 miles) in circumference. The particles will interact with the magnetic fields to gain energy with each revolution. The LHC will be capable of accelerating protons to energy levels of about 14 TeV (trillion electronvolts, where a trillion is equal to 10^12), or 2.2 x 10^-6 joules. Nuclei of lead atoms will be accelerated to speeds sufficient to cause collisions having energy levels near 1150 TeV, or 1.8 x 10^-4 joules. The electronvolt (eV) is the amount of kinetic energy gained by an electron passing through an electrostatic field producing a potential difference of one volt.
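Those joule figures follow directly from the eV definition, using the value 1 eV = 1.602176634 x 10^-19 J. A quick back-of-the-envelope check:

```java
// Verify the TeV-to-joule conversions quoted above.
public class BeamEnergy {
    // 1 eV expressed in joules.
    static final double EV_IN_JOULES = 1.602176634e-19;

    // 1 TeV = 10^12 eV, so multiply by 1e12 and then by joules per eV.
    static double teVToJoules(double teV) {
        return teV * 1e12 * EV_IN_JOULES;
    }

    public static void main(String[] args) {
        System.out.printf("14 TeV   = %.2e J%n", teVToJoules(14));    // ~2.2e-6 J
        System.out.printf("1150 TeV = %.2e J%n", teVToJoules(1150));  // ~1.8e-4 J
    }
}
```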

> discusses the big questions that may be answered by the LHC.
> CERN maintains an official LHC Web site.
> Scientists will use a worldwide computer network to process data generated by the LHC.
> maintains an interactive image of the LHC.

Friday, August 29, 2008

Optimizing the conversion of numbers to strings and vice versa

Read this interesting article to see how the guys at Mainsoft optimized the conversion of numbers to strings and vice versa by more than 2.6x. Their goal was to improve the performance of text-based Internet protocol implementations such as XML and HTML.

The article also includes a valuable comparison between the conversion algorithms in .NET and Java. It's a good reminder for all of us to keep looking for improvements and to stop taking things for granted. The source code is available too.
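The article's exact tricks aren't reproduced here, but one classic flavor of this kind of optimization is easy to sketch: emit two decimal digits per division using a precomputed pair table, roughly halving the divisions of the naive digit-at-a-time loop. Everything below is a generic illustration, not Mainsoft's code.

```java
// Integer-to-string conversion using a 00..99 digit-pair lookup table.
public class FastItoa {
    // PAIRS holds "000102...9899": two chars per number from 0 to 99.
    private static final char[] PAIRS = new char[200];
    static {
        for (int i = 0; i < 100; i++) {
            PAIRS[2 * i]     = (char) ('0' + i / 10);
            PAIRS[2 * i + 1] = (char) ('0' + i % 10);
        }
    }

    public static String toStringFast(int value) {
        if (value == Integer.MIN_VALUE) return "-2147483648"; // abs overflows
        boolean neg = value < 0;
        int v = neg ? -value : value;
        char[] buf = new char[12];          // 10 digits + sign, with slack
        int pos = buf.length;
        while (v >= 100) {                  // two digits per division
            int r = v % 100;
            v /= 100;
            buf[--pos] = PAIRS[2 * r + 1];
            buf[--pos] = PAIRS[2 * r];
        }
        buf[--pos] = PAIRS[2 * v + 1];      // remaining one or two digits
        if (v >= 10) buf[--pos] = PAIRS[2 * v];
        if (neg) buf[--pos] = '-';
        return new String(buf, pos, buf.length - pos);
    }

    public static void main(String[] args) {
        System.out.println(toStringFast(-1234567)); // prints -1234567
    }
}
```

The table costs 200 bytes but halves the expensive divide/modulo work, which is the kind of trade-off such conversion optimizations tend to make.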

Sunday, August 17, 2008

TopCoder & KawigiEdit

Yesterday, I took part in an SRM on TopCoder… It had been more than a year since my previous SRM… When I got the notification email for the SRM, I remembered the past days of algorithm competitions and thought… Why don't I participate? It's Saturday and I am free :-)

Fifteen minutes before the coding phase, I had no environment ready!! But my friend Ahmed Mounir came to the rescue… He sent me an editor plug-in for TopCoder's Arena and the instructions to install it…

That was about me!! So what about TopCoder and KawigiEdit?! Just go on…

TopCoder is a web site providing online algorithm competitions. You compete with people all over the world in time-based competitions to solve problems. It gives you experience and improves your coding skills and the way you think. TopCoder's Arena is the place where you code, register, challenge, and compete with other people in SRMs.

Time-based algorithm competitions (known as SRMs, "Single Round Matches") are not the only format; Design, Software Development, and Assembly competitions exist as well. But I have only tried the algorithm competitions. You can get more info from "Just do it & Hit the link :-)"

Okay, since you are still patient up to this line :-D … let's move on to KawigiEdit…

KawigiEdit is an editor installed as a plug-in in TopCoder's Arena. It has more features and is more helpful than the standard editor of the Arena. It comes with templates that generate the structure of your code… You only need to fill in the implementation of the required function. It can also run the test cases for you. You may also save the file locally & access it using another IDE… I did so to use my more friendly IDE, Eclipse…

This is the link to download the Editor… and another for the documentation. Kawigi’s main page is here… As a quick guidance, download the editor, and do the instructions under “Installing KawigiEdit” in the documentation.

Want my advice? Give TopCoder competitions a try… You will never regret it –In Shaa' Allah– Yesterday, Egypt was ranked 30th worldwide… Today, I found it in 29th place. And there are a little more than 150 members from Egypt… You can make it better, can't you?

Monday, July 21, 2008

Building a New Computer

Although many people are already experts when it comes to building a new computer from scratch, many others are still a bit intimidated by the concept of looking for each component, and putting everything together. For those who actually want to know how to build a computer, the computer help website "The How-to Geek" has put up an excellent tutorial about building a computer from scratch.

The tutorial is mainly divided into five parts (the titles are self-explanatory):

The tutorial is very thorough, and full of pictures and screenshots, which makes it even easier to read and follow. If you are interested in learning about building computers, make sure to check it out.

Saturday, July 12, 2008

The History of Programming Languages

For 50 years, computer programmers have been writing code. New technologies continue to emerge, develop, and mature at a rapid pace. Now there are more than 2,500 documented programming languages! O'Reilly has produced a poster called History of Programming Languages (PDF: 701K), which plots over 50 programming languages on a multi-layered, color-coded timeline.

How It Started

We first saw the "History of Programming Languages" diagram, created by Éric Lévénez, while visiting our French office. We were so taken with the level of detail and the visual impact of viewing 50 years of programming history that we wanted to come up with a way to share it more widely. We started big. We printed it out full-size, all 18 feet of it, on our plotter and ran it along a wall at our Mac OS X Conference last fall. So many people came by to make notations on the diagram that we knew there would be a lot more interest and discussion if we could only get it in a more manageable format. With Éric's permission, we collected comments from our authors, editors, and friends, and rebuilt the file so we could print it at its current dimensions, 39" x 17". Éric maintains a site with his original diagram, change logs, an explanation of how he creates his charts, and links to additional resources such as Bill Kinnersley's Language List of over 2,500 programming languages. Éric also has Windows and Unix historical diagrams that he makes available for non-commercial purposes, all at

About the O'Reilly Poster

"Cool" is generally the first thing we heard from people who reviewed our poster. Then came reams of suggestions for additions to the diagram. We made only a small number of changes--in order to keep the file in a relatively manageable state that enables us to print and share the poster--but there is a high level of historical knowledge and personal experience of the events in this poster among our friends, authors, and editors. We hope to inspire and capture your comments and discussion here in our History of Programming Languages Wiki. Please note, however, that we do not intend to update the poster. Our walls aren't big enough.

Getting Your Copy

The poster is available online in PDF format (701k). You can also find full-size copies, while they last, at O'Reilly conferences (

Special Thanks

Thanks to all who reviewed and commented on this poster along the way, including Éric Lévénez, Mark Brokering, Mark Stone, Daniel Steinberg, David Flanagan, Ian Darwin, Tim O'Reilly, Mike Hendrickson, Laurie Petrycki, Geoff Collyer, and Mark Brader.

Copied, Share With Love


Tuesday, July 8, 2008


HTML 5 is the next planned revision of the Hypertext Markup Language (HTML), which is a set of markup symbols or codes that can be inserted in files intended for display on Web browsers. In 2007, HTML 5 was adopted by the new HTML working group of the World Wide Web Consortium (W3C). This group published the first public draft of HTML 5 in January 2008. Refinements may continue for years before HTML5 becomes a formal recommendation.

HTML 5 is expected to offer numerous improvements over HTML 4, including:

  • New parsing rules for enhanced flexibility
  • New attributes
  • Elimination of outmoded or redundant attributes
  • Immediate-mode drawing
  • Drag and drop
  • Back button management
  • Timed media playback
  • Offline editing
  • Messaging enhancements
  • Detailed rules for parsing
  • MIME and protocol handler registration

HTML 5 will be designed so that older browsers that do not support it can safely ignore the new constructs, producing legible Web pages in most cases even if the syntax is not compatible.

Elliotte Rusty Harold, an Adjunct Professor at Polytechnic University, wrote on IBM's developerWorks pages that HTML 5 will be:

...instantly recognizable to a Web designer frozen in ice in 1999 and thawed today. There are no namespaces or schemas. Elements don't have to be closed. Browsers are forgiving of errors. A p is still a p, and a table is still a table. At the same time, this proverbial unfrozen caveman Web designer would encounter some new and confusing elements. Yes, old friends like div remain, but now HTML includes section, header, footer, and nav as well.

> The W3C has published the technical details of HTML 5.
> The W3C also explains the differences between HTML 4 and HTML 5.
> There's a handy linked index to elements and attributes in the HTML 5 specification.

Friday, June 20, 2008

The Best Freeware List

For those, like me, who love free software (who doesn't?), it can sometimes be hard to find a good piece of software that does a certain task, especially since these freeware products appear and disappear all the time. This scenario has happened to me personally too many times: I find a nice freeware program, and I use it for a long time. Then at some point, I recommend it to someone, only to find that it has become commercial software (and is not free anymore), so I have to go and try to find a good freeware product to replace it. If this has ever happened to you, then you know what I mean.

That's why I was very glad when I found Gizmo's Tech Support Alert. This site offers a list of the top freeware products out there. It is classified into categories like:

  • Security / Privacy / Encryption

  • Internet: Email / IM / FTP / File sharing / Download managers

  • Enhancements to Windows / Desktop

  • Computer maintenance / Performance

  • System utilities / Backup / Data recovery

  • Audio / Video / CD / DVD

  • ... and more.

Each category is further divided into subcategories for easy navigation. Each sub-category usually features 3 or 4 products, with a full review showing the pros and cons of each product, so that the reader can make an informed decision of which product to use. The list is also updated constantly, removing products that are not freeware anymore, and adding new software that just proved itself to be worth mentioning. The website also provides a free monthly newsletter (with a paid version that has more stuff in it), and a forum where visitors discuss freeware programs and computer problems.

This site used to be a lone effort by Ian "Gizmo" Richards, who created and maintained a highly popular list of the "46 Best-ever Freeware Utilities", but over time that list grew well beyond 46 and reached the point where it could not be maintained by one person. So, now the website has changed to wiki-style, where it relies on the contribution of dozens of volunteer editors who edit and moderate suggestions from thousands of site visitors. As a result the range of software covered is ever increasing and quality of the reviews ever improving. In some sense, some might think of it as a Wikipedia for Freeware.

At this point, whenever I am looking for a software product in a certain category, this is the first place to check. I wonder how many others do the same. Again, the URL of the website is

Wednesday, June 18, 2008

Google's New Favicon

When you are as famous and well-known as Google, the slightest changes in your identity will be noticed by millions. A couple of weeks ago, many people around the globe noticed that the big capital "G" that was used as the favicon for the Google website was replaced with an unrecognizable lower-case "g" (the second "g" in "Google").

old favicon
Old Favicon
new favicon
New Favicon

Some people seem to like the new icon, while many others (including myself) seem to prefer the old one. I personally think that the old favicon was instantly recognizable as the initial "G" from "Google", while the new one does not have that quality. For a brand name like Google, I believe it is very important to have a logo (or even something as small as a favicon) that can be recognized.

Google's reason for this change (as mentioned on their blog) is the following:
..we wanted to develop a set of icons that would scale better to some new platforms like the iPhone and other mobile devices..

They also state that the selection process was not easy, since they had to choose from among more than 300 different permutations. Some of the other alternatives they came up with are:

Seeing these designs, I personally would have chosen one of the more colorful designs with the capital "G" in it, since these two features (the capital "G" and the colors) are what come to mind when you think Google. However, they also say that this is not the final design, but rather a first step toward a more unified set of icons. So, hopefully, they will come up with something better. They also welcome suggestions, so if anyone can come up with an idea for a design, they can submit it here.

So, what do you think about the new favicon? Do you like it or not? And why?

Firefox 3

Firefox 3 (Fx3 or FF3) is the third version of the popular Web browser released by the Mozilla Corporation. FF3 includes improvements to security, performance, support for developer add-ons and usability.

New features for this version of Firefox include:

  • One-click bookmarking, in which clicking a "star" button allows a user to quickly add bookmarks from the location bar, file and tag them.
  • Full zoom for Web pages, including the option to save zoom setting for individual websites.
  • A new API for microformats that developers can use to build add-ons.
  • Support for offline Web applications that will work within the browser and synchronize once connectivity is restored.
  • Resumable downloading, allowing users to continue downloads after interruptions, such as restarting the browser or resetting a network connection.
  • An add-on Manager that offers improved management of plug-ins and other third-party components.
  • Updated password management.
  • Improved graphics and font handling, with rendering improvements in CSS and support for images with embedded color profiles.
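Resumable downloading, by the way, typically builds on the HTTP/1.1 `Range` header: the client tells the server how many bytes it already has, and a range-aware server sends only the rest. A rough Python sketch of the idea (the URL and function name are placeholders of mine, not anything from Firefox's implementation):

```python
import urllib.request

def resume_request(url, bytes_already_have):
    """Build a request for the rest of a file using an HTTP/1.1 Range header.

    A server that supports ranges answers '206 Partial Content' with only
    the missing bytes; one that doesn't answers '200 OK' from the start,
    and the download simply restarts from scratch.
    """
    req = urllib.request.Request(url)
    if bytes_already_have:
        req.add_header("Range", "bytes=%d-" % bytes_already_have)
    return req

# Typical use: resume from however much of the file is already on disk, e.g.
#   done = os.path.getsize("big.iso.part") if os.path.exists("big.iso.part") else 0
#   req = resume_request("http://example.com/big.iso", done)
```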

Firefox 3 also addresses malware and phishing protection in a number of ways, including:

  • Malware warnings, which alert users when they visit sites known to install viruses, spyware, trojans or other malicious software.
  • Web forgery protection, which prevents the content of pages suspected as Web forgeries from being shown.
  • Add-ons and plug-in version management, which automatically checks for and disables older, insecure versions.
  • Support for Vista parental controls, which can be set to disable file downloads.
  • Improved protection against cross-site scripting and JSON data links.
  • A site identification button that acts as a color-coded security indicator and displays information about a given site, including the presence of SSL.

Firefox uses the open source Gecko layout engine and is based on the Mozilla browser from which much of its code was originally derived. The source code for Firefox is free, open source software (FOSS) and is released under a tri-license GPL/LGPL/MPL. Mozilla has released Firefox 3 for Windows, Linux, and Mac OS X in a variety of languages.

According to Mozilla's performance tests, Firefox 3 is twice as fast as Firefox 2 and nine times as fast as Internet Explorer 7. Mozilla's memory usage tests found Firefox 3 twice as efficient as Firefox 2 and more than four times as efficient as IE7.

> Mozilla is hosting a worldwide Download Day for Firefox 3, with hopes of setting a Guinness World Record for the most software downloads in a single day.
> Deb Richardson has posted an excellent Field Guide to Firefox 3.
> Mozilla also hosts a webpage for the Firefox 3 development community.
> Mozilla's Firefox blog features links, resources and commentary about the upcoming release.

Sharing is Caring ...

Friday, May 30, 2008

Robotic Personality

Robotic personality is an advanced aspect of artificial intelligence (AI) in which smart machines display idiosyncratic human behavior. In particular, "personality" refers to the ability of a robot or personal computer (PC) to interact with people emotionally as well as on a logical level.

The notion of robotic personality is based on anthropomorphism, a tendency for people to think of certain objects or machines as having human-like characteristics. Anthropomorphism is not new. In the 1800s, Charles Babbage conceived a device called the Analytical Engine that seemed as if it would have a sense of "aliveness." Today, computers and robots have brought anthropomorphism out of the realm of science fiction. Robots can be programmed to rescue a human from a burning building or to administer medication in a hospital. High-end PC programs can learn from their mistakes (or from the errors of their users), improving performance over time. Machines can generate order from chaos, one of the prime criteria scientists use to define life.

In science-fiction books and movies, computers and androids are easy to anthropomorphize. A well-known example of anthropomorphism with respect to a computer occurs in the novel and movie 2001: A Space Odyssey. In this story, a spacecraft is controlled by Hal, a computer that becomes paranoid. A fictional android with especially human-like characteristics is Data from the series Star Trek: The Next Generation. Owners of high-end personal robots sometimes think of the machines as companions.

> The Guardian (UK) describes how robots might interact with humans in the future.
> Hammacher Schlemmer distributes an interactive robot panda with multiple personalities.
> Maja Mataric is developing a care-giving robot with a personality.

Monday, May 26, 2008

VerveEarth - Surf the web by geography

Checking my email today...
My eyes stopped at the subject line "Amr Kabardy's Blog, Egypt & VerveEarth"!! What is this?

I opened it to find:
Your blog
Amr Kabardy's Blog caught our attention. I'm the founder of a recently launched startup for bloggers. We are searching the internet for the world's blogs by geography, and we found yours for Egypt. I would like to invite you to our site which....etc
WOW! So, What is that site? Here we go...

VerveEarth.. is a new idea for surfing the web: you surf according to geographical location. You browse content via an interactive map of the world; you can check what is going on in your region; you can see what people are blogging about in a given region; you can share with friends; you can keep an eye on your favorites; and you can register and link your blog, which may bring it more traffic :) ...and so on.

Give it a look, it's still in beta! But the idea looks interesting... Now you can see pages linked to their locations around the world :)

Here is the link: and the FAQ
And this is a link to my destination on VerveEarth :D
Have fun :)

Saturday, May 17, 2008

Officially: Windows XP on the XO laptops built for children

Microsoft has finally announced official Windows XP support for the OLPC XO ("One Laptop per Child") machines, after spending more than a year developing an XP build compatible with the XO. The XP-capable XO will be available in markets starting next June, and it will ship alongside the Linux system the device previously ran.

The XP build designed for the XO will support the device's features, such as e-book reading, the writing pad (stylus input), the camera, and standard WiFi. According to Microsoft, the XO's Windows XP edition contains the full feature set found on any ordinary XP-capable machine.

The device will cost slightly more with XP on it: the $200 machine will cost an extra $3 for the Windows license, while the dual-boot (XP and Linux) version will cost an extra $7 for the additional parts needed to run more than one system.

There are certainly many people who are unhappy about the whole thing...

Copied from

Friday, May 16, 2008

Hardy Heron

Hardy Heron is the code name for version 8.04 of Ubuntu, the open source Debian-based Linux distribution. It is widely recognized as one of the most easy-to-use desktop Linux distributions for novice users. Canonical Software, the sponsor for Ubuntu, has released Ubuntu 8.04 LTS Server Edition to target the enterprise IT market, competing with Red Hat and SuSE Linux.

Hardy Heron can be run on a PC without uninstalling other operating systems. It may be downloaded or started from a LiveCD inserted while Windows is running. When users open the disc for the first time, an installer called Wubi lets them place an Ubuntu installation inside of Windows (stored as files on the Windows partition rather than as a true virtual machine). The next time they boot up, Ubuntu will be available as a boot option.

In general, Hardy Heron improves on earlier versions of Ubuntu by upgrading previous functions and software rather than adding new features. The operating system includes support for CD burning, a BitTorrent client, more wireless drivers and virtual network computing (VNC). Users who dual-boot between Windows and Ubuntu are also now able to read and write directly to the Windows partition. The update also features stronger encryption and improved support for third-party plugins and drivers.

Hardy Heron is the second version of the operating system to be released under Ubuntu's Long Term Support (LTS) agreement. LTS includes five years of security updates for the server edition and three years of support for the desktop. Hardy Heron was released in April 2008 and succeeded Gutsy Gibbon.

> Learn which distributions of Linux have a GUI install at
> You can download the most recent version of Hardy Heron at
> SearchEnterpriseLinux has a guide for learning more about Linux distributions.
> You can learn more about Hardy Heron at this wiki on

Copied ... Sharing is Caring

Tuesday, May 6, 2008

Bibliotheca Alexandrina - Behind Closed Doors

Here is the story!

As we are approaching graduation in about one month, In Shaa' Allah, some companies are interviewing and hiring, some are announcing themselves, and some are showing up in the market, taking the cover off their projects and attracting fresh minds to work with them :)

Here we go... Last Sunday - May 4, 2008 - the ICT department of the Bibliotheca Alexandrina (BA) gave my class a tour... A tour behind the closed doors of the BA... A tour to reveal their projects... A tour to show off the stuff they work on... A tour to express what keeps them busy... A tour to attract us to work for the BA... So, after this great tour, what did we see? Here is a bit about their projects:

Virtual Immersive Science & Technology Applications (VISTA)
Well, it's a set of Virtual Reality applications, taking you inside the model and giving you the sense that you are there in the real world... Using special glasses, you can see 3D objects on 4 projector screens... By visualizing models, VISTA can be helpful in many fields of science. For example:
  • VISTA can help in the study of the effect of wind on the Sphinx using a 3D simulation that shows which areas of the Sphinx are most affected by the wind.
  • VISTA can help in Architecture too: a model of the BA itself has been built. You can navigate through it, even inside the building, taking in every single detail on the walls, floors... etc.
  • VISTA can help in Chemistry, for example by visualizing how atoms move, join, and split during chemical reactions.
Check the VISTA website here for more information, demos, VISTA projects... etc.

Internet Archive
Hey, web pages on the Internet are modified, updated, removed, and added all the time. Consider how annoying it would be if you wanted to go back to some page you had already read, but the page had been removed or modified so you couldn't get the desired info. With Internet archiving, this problem can be solved. The BA is proud to host one of the only two copies of the Internet Archive in the world. Personally, I found an older version of my homepage using their URL search.

Well, you can check the Internet Archive here.
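Incidentally, the Internet Archive's Wayback Machine exposes a small public JSON endpoint for checking whether a page has been archived. A sketch in Python (the endpoint and response shape are as I understand the API; double-check against the archive's own documentation before relying on it):

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

API = "https://archive.org/wayback/available"  # Wayback Machine availability API

def availability_query(page_url, timestamp=None):
    # Build the lookup URL; timestamp (YYYYMMDD) asks for the snapshot
    # closest to that date.
    params = {"url": page_url}
    if timestamp:
        params["timestamp"] = timestamp
    return API + "?" + urlencode(params)

def closest_snapshot(response_text):
    # The API answers with JSON; pull out the closest snapshot's URL, if any.
    data = json.loads(response_text)
    closest = data.get("archived_snapshots", {}).get("closest")
    return closest["url"] if closest else None

# Usage (performs a real network request):
# with urlopen(availability_query("example.com", "20080101")) as resp:
#     print(closest_snapshot(resp.read().decode("utf-8")))
```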

Hmmm, I don't wanna make it long for you, so I'll just talk briefly about two more projects.

Universal Networking Language (UNL)
The idea is that it would be very helpful if anyone could read any book in their preferred language, even if the book was originally written in another one. So what about translating from one language to another? It can be awful and hard. Suppose you want to translate 5 books, each in a different language, into the other 4 languages: you would have to run the translation process 20 times!! That's too much. Here is where the concept comes in: each book is transferred once into the UNL syntax, and then a UNL compiler outputs the desired translated version. That seems much easier. UNL describes the semantics, not the syntax, of the different languages.
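The saving becomes clearer with a little arithmetic: translating directly between n languages needs a separate system for every ordered pair of languages, i.e. n(n-1) of them, while a UNL-style interlingua needs only 2n (one encoder into UNL and one decoder out of it per language). A quick sketch of the count (the function names are mine, just for illustration):

```python
def pairwise_systems(n_languages):
    # Direct translation: one system for every ordered pair of languages.
    return n_languages * (n_languages - 1)

def interlingua_systems(n_languages):
    # UNL-style: one encoder into UNL plus one decoder out of it per language.
    return 2 * n_languages

for n in (5, 10, 20):
    print("%2d languages: direct=%3d  UNL=%2d"
          % (n, pairwise_systems(n), interlingua_systems(n)))
```

The gap grows quadratically: at 5 languages the interlingua halves the work, and at 20 languages it needs 40 components instead of 380.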

Check here for more information.

Digital Asset Repository (DAR)
What is the DAR? I quoted the answer for you...
The Digital Assets Repository (DAR) is a system developed by ISIS to create and maintain the Bibliotheca Alexandrina's digital collections. DAR acts as a repository for all types of digital material (obtained from the Library or acquired from other sources), preserving and archiving digital media, and providing public access to digitized collections through web-based search and browsing facilities.
More information about DAR is here.

This is not all we saw. We also visited the server rooms, digitization rooms... etc. But the post is getting too long :)
Wanna know more? Go here to the ISIS (International School of Information Science) website; it will tell you what you need to know :-) All these projects are part of the ISIS work.

Note: All links on this post are last accessed on May 6, 2008

16 Ways to Keep A Razor-Sharp Focus at Work

Focus is something of a novelty these days. We’ve got cellphones for texting and calls, IM, Twitter, Email, RSS feeds, Facebook, Myspace… the list goes on and on. If you don’t have ADD before you start working online, it seems it’s almost inevitable thanks to these inputs. If you’re a web worker who uses the Internet for the majority of the day, you’re especially at risk for losing focus.
Focus is something that must be fought for. It’s not something that automatically switches on when you want to. You have to make sure your surroundings are perfect for working if you want to be focused. Here’s a few ways I’ve found this to work:
  1. Use offline tools. Paper products, pens, and other physical tools are a Godsend for those of us who have a hard time focusing throughout the work day. They’re so simple that we can use them quickly, without having to worry about becoming distracted.
  2. Take more breaks. More breaks = More productivity. It may sound wrong, but it's true. Breaks allow us to regroup our thoughts and focus on the task at hand. They also keep us fresh so that we don't end up burning out after only a few hours' work.
  3. Smaller tasks to check off. When you’re planning your day, make sure that your “action steps” (aka items in the checklist) are small actions. Instead of “Paint living room”, try breaking it down into many tasks, like “buy paint, buy rollers, pick colors” etc.
  4. Keep a steady pace. Don't try to do too much. Keeping the pace manageable allows you to keep your focus. Unfortunately, people can confuse this with "Work till you drop without breaks". See number 2.
  5. Keep a daily "purpose" card. It's pretty easy to get lost staring at the computer all day long. We'll find rabbit holes to wander down (i.e. Youtube, Myspace, etc.) if we're not careful. Having your daily purpose card gives you clarity and a reminder as to what you're doing today.
  6. Develop the mindset that the computer is only a tool. It's easy to try and use the computer for too much. At its core, the computer is merely a tool (albeit a freakin' awesome one) that allows us to do work more efficiently. If we're using it as something more than that (like as a solution for our lives), we'll ultimately fail. It's like trying to eat a steak dinner with only a spoon.
  7. Plan your day to the T. If you're finding sporadic periods of laziness throughout the day, it could be because you don't take enough breaks (see #2), and you don't have the day mapped out as efficiently as you could. Make sure your list of todos has lots of small, actionable steps that can be done quickly. This will give a really satisfying feeling when you're crossing things off your list like crazy.
  8. Notice your lazy routines. Everyone has recurring lazy spots throughout the day. Plan to have your breaks for those times. You’re going to be lazy then anyway, right?
  9. Plan the night before. Planning the night before is a great way to really get focused on the next day. “Sleeping” on your tasks and goals for the following day can really help your mind expect what’s going to happen the next day. Essentially, you’re preparing your mind for the following day. Advanced focus.
  10. Turn off extra inputs. These are IM and email for me, but we all have our Achilles heel. Completely turn off any distracting piece of technology that you own. Every one of these inputs tries to steal bits of your focus. And they won’t rest until they do.
  11. Set time limits for tasks. There's no motivation like a deadline. Giving yourself real deadlines is a great way to stay motivated and focused on the task. Given the fact that we humans are natural procrastinators, it's no surprise that we'll take as long as we're allowed to finish something. Setting real but attainable limits is a great way to keep the project humming, so to speak.
  12. Keep a journal of what you did throughout the day. I like to use a moleskine notebook for my lists just so I can go back and review it every now and again, to see what I’ve done. Knowing how far you’ve come can keep you sharp and motivated to finish.
  13. Use programs to track where you spend your time. This is a real eye-opener. Knowing just how much time you spend every day/week/month on a certain site or with a certain program can quickly show you where your priorities lie. I recommend RescueTime, but there are many others.
  14. Visualize the day in the morning, before it starts. A little pre-work meditation on the day’s events is a great way to start the day off focused and productive. Don’t worry about a full 30 minute session, a quick review before you start the day is fine.
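Point 13 doesn't strictly require a dedicated app, either; even a few lines of Python can log where your time goes (a toy sketch of mine, nowhere near RescueTime's automatic window tracking):

```python
import time
from collections import defaultdict
from contextlib import contextmanager

seconds_spent = defaultdict(float)  # task label -> total seconds

@contextmanager
def tracked(task):
    # Everything done inside the `with` block is billed to `task`.
    start = time.monotonic()
    try:
        yield
    finally:
        seconds_spent[task] += time.monotonic() - start

with tracked("email"):
    time.sleep(0.05)  # stand-in for real work

with tracked("writing"):
    time.sleep(0.02)

# Print tasks, biggest time sink first.
for task, secs in sorted(seconds_spent.items(), key=lambda kv: -kv[1]):
    print("%-10s %.2fs" % (task, secs))
```

Reviewing the totals at the end of the day serves the same eye-opening purpose as a tracking program.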
Sharing With Love

Wednesday, April 30, 2008

How to Protect Yourself from Phishing?

For those who don't know what phishing means, Wikipedia defines phishing as:

"an attempt to acquire sensitive information, such as usernames, passwords and credit card details, by masquerading as a trustworthy entity in an electronic communication."

The most common form of phishing is when someone receives an "urgent" email asking them to take immediate action to prevent some impending disaster. Here are some examples:
"Our bank has a new security system. Update your information now or you won't be able to access your account."

"We couldn't verify your information; click here to update your account."

Once a person clicks on the provided link, they are taken to a webpage that looks exactly like the legitimate web site that they know (e.g. the website of their bank). Because the page looks familiar, people enter their username, password, or other private information on the site, not knowing that they have just given their information to someone unknown, who can now use this information to hijack their account, steal their money, or open up new lines of credit in their name. They just fell for a phishing attack.
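One classic trick is a link whose visible text shows the bank's address while the underlying href points somewhere else. A deliberately minimal Python illustration of that mismatch check (real mail filters do far more, and the function name is my own):

```python
from urllib.parse import urlparse

def looks_like_phish(link_text, href):
    # Flag a link whose visible text names one host while the actual
    # destination (href) points somewhere else entirely.
    shown = urlparse(link_text if "://" in link_text else "http://" + link_text)
    actual = urlparse(href)
    if "." not in shown.netloc:
        # The visible text isn't a URL/domain, so there's nothing to compare.
        return False
    return shown.netloc.lower() != actual.netloc.lower()

print(looks_like_phish("www.mybank.com", "http://evil.example.net/login"))  # True
print(looks_like_phish("www.mybank.com", "http://www.mybank.com/login"))    # False
```

You can apply the same check by eye: hover over a link and compare the destination shown in the status bar with the text of the link itself.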

Google just published a warning on their official blog about these phishing emails. According to the post, you can reduce your chances of becoming a phishing victim by following these steps:

  • Be careful about responding to emails that ask you for sensitive information

  • Go to the site yourself, rather than clicking on links in suspicious emails

  • If you're on a site that's asking you to enter sensitive information, check for signs of anything suspicious

  • Be wary of the "fabulous offers" and "fantastic prizes" that you'll sometimes come across on the web

  • Use a browser that has a phishing filter

You can read the details of these steps here. In addition, there are several quizzes online to test whether you can differentiate between a legitimate webpage (or email) and a phishing one. Just type "phishing IQ quiz" in your favorite search engine, and enjoy!!

Wednesday, April 16, 2008

Nile University: Wireless Intelligent Networks

For the last three days, I've been attending the conference on wireless intelligent networks organized by the Nile University in the Smart Village. The conference was held under the auspices of Dr. Tarek Kamel, the minister of communications and information technology. Ohio State and Rice universities also contributed to the conference. The conference was followed by a WARP workshop, but only a limited number of the attendees were invited.

It was a great initiative from the Nile University to introduce this interesting field to the academic community in Egypt. University students were also invited to get exposed to the ongoing research in wireless networks and get in touch with the world leaders in this technology. You can find all the information you need about the event on the conference website. The conference presentations should be available soon.

The conference was more oriented toward EE topics. As a CS undergraduate, I had some difficulty following some of the talks, but it was a good experience after all. I talked to some of the speakers about the role of CS students in this field, and here is what I got:

"The middle east is going to become very powerful both using and developing technology. There is going to be a tremendous need for better ideas," said Prof. A. Paulraj. He also mentioned some topics of interest regarding mobile technology including: powerful processes that consumed little power, new architectures that saves power using techniques like clock gating, more user friendly interfaces suitable for dealing with more data, security and clean slate internet.

"You should take your studies very seriously," said Prof. A. El Gamal.

"Go outside traditional education. Think outside the box. Whatever you learn isn't just courses, you should find points of interlinking between the things you learn. Think about the applications of what you study. Think about services and how it can be provided in a systematic and organized manner," said Prof. M. Eltoweissy.

"If you want to make something outstanding in networks, you have to combine the knowledge from both EE and CE. Without understanding the physical layer, your work will be rather theoretical," said Prof. A. Abozeid.

Finally, I would like to commend Prof. Hesham El Gamal and the Nile University students for their efforts in organizing this conference.