KAMAL KUMAR: 2009


Tuesday, October 27, 2009

Linux History


In this lesson, we will learn about Linux history and open source software. Linux was born in 1991 at the University of Helsinki, Finland, where Linus Torvalds began developing a UNIX-like kernel. He first announced his work in a now-famous message on the comp.os.minix mailing list:
From: torvalds@klaava.Helsinki.FI (Linus Benedict Torvalds)
Newsgroups: comp.os.minix
Subject: What would you like to see most in minix?
Summary: small poll for my new operating system
Message-ID: 1991Aug25.205708.9541@klaava.Helsinki.FI
Date: 25 Aug 91 20:57:08 GMT

Organization: University of Helsinki

Hello everybody out there using minix -

I'm doing a (free) operating system (just a hobby, won't be big and
professional like gnu) for 386(486) AT clones. This has been brewing
since april, and is starting to get ready. I'd like any feedback on
things people like/dislike in minix, as my OS resembles it somewhat
(same physical layout of the file-system (due to practical reasons)
among other things).
I've currently ported bash(1.08) and gcc(1.40), and things seem to work.
This implies that I'll get something practical within a few months, and
I'd like to know what features most people would want. Any suggestions
are welcome, but I won't promise I'll implement them :-)
Linus (torvalds@kruuna.helsinki.fi)
PS. Yes - it's free of any minix code, and it has a multi-threaded fs.
It is NOT portable (uses 386 task switching etc), and it probably never
will support anything other than AT-harddisks, as that's all I have :-(.

The kernel is the core of the Linux operating system. It is the layer between the software and the hardware that manages and controls resources such as the processor and memory.
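As a small illustration of the kernel's role (a sketch assuming a Unix-like system; the Python functions shown are standard-library wrappers around system calls):

```python
# User programs never touch hardware directly; they ask the kernel for
# services through system calls. Python's os module wraps many of them.
import os

info = os.uname()  # wraps the uname(2) system call
print(info.sysname, info.release)  # e.g. "Linux" and the kernel version

# Even "who am I" is kernel-managed state: the kernel assigns and
# tracks process IDs, and programs merely ask for them.
print("PID assigned by the kernel:", os.getpid())
```

Every line here ultimately crosses the boundary into the kernel and back, which is exactly the layering the paragraph above describes.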

Despite the humble tone of the email, Torvalds's kernel went on to become a professional kernel held in the highest regard in the computing world. Today, Torvalds's kernel, along with tools from the GNU project and elements from other open source projects (X from Xorg, for example), makes up the core of Red Hat Enterprise Linux and other Linux distributions, such as the Fedora Project.




Open source software:
Software whose source code is freely available to all is known as open source software (OSS). The purpose of OSS is to encourage collaborative work, often through broad participation in software projects across business and geographical boundaries.
Two main groups that promote the benefits of OSS define it in different ways. The Open Source Initiative (OSI) defines OSS as having the following features:

  • The software and source code must be freely distributable
  • All users must be able to modify the source code and create derived works
  • To maintain the integrity of the original author’s work, the license may require that modifications to the code be provided in patch form
  • The license has to be inherited, so that those who receive a distribution are subject to the identical terms
  • The license must be nondiscriminatory with respect to persons, groups, or fields of endeavor; it must be free of restrictions that can limit the license. For example, it may not require that the software be a part of a particular distribution; it must not restrict other non-OSS software and it may not require the use of technology to apply the license.
See http://www.opensource.org/docs/definition.php for specific terms.
An alternative definition of open source software comes from the free software movement, which emphasizes the ethical aspects of software and source code availability. As presented by the GNU Project and the Free Software Foundation at http://www.gnu.org/philosophy/free-sw.html, software is free if it satisfies four freedoms:


  • The software must be freely executable for any purpose
  • The source code must be available so that others can study how it works
  • The software must be freely redistributable
  • All users must be free to modify the software

Still other licenses, such as the Berkeley Software Distribution (BSD) license, enforce other interpretations of open source. For example, the BSD license does not enforce inheritance, meaning that BSD-licensed software may be incorporated into closed-source projects.













Concepts in IP Security

No topic related to the Internet, with the possible exceptions of the free availability of pornography and the plague of unwanted spam email, has received more attention in the mainstream media than "security." For the average user the concerns are predominantly viruses that may infect their personal computers, causing inconvenience or damage to their data. Increasingly we also hear about white-collar e-criminals who steal personal financial details or defraud large institutions after illegally gaining entry to their computer systems. We are also now all familiar with catastrophic failures of parts of the Internet. Although these are sometimes caused by bugs in core components (such as routers) or by the perennial backhoe cutting a cable or fiber, they are increasingly the responsibility of individuals whose sole joy is to pit their wits against those who maintain the Internet.

Sometimes known as hackers, these people attempt to penetrate network security, or cause disruption through denial of service attacks, for a range of motives. Corporate espionage is of relatively little concern to most people, but within every forward-looking company there is a person or a department responsible for keeping the company's secrets safe. At the same time, the populist war against terrorism invokes contradictory requirements: that the government should be able to keep its information private while at the same time examining the affairs of suspects without them being able to hide their communications. Whatever the rights and wrongs of the politics and sociology, Internet security is a growth industry. This chapter provides an overview of some of the issues and shows the workings of the key security protocols.

It introduces the security algorithms without going into the details of the sophisticated mathematics behind encryption algorithms or key generation techniques. For this type of information the reader is referred to the reference material listed at the end of the chapter. The first sections of the chapter examine the need for security, where within the network it can be applied, and the techniques that may be used to protect data that is stored in or transmitted across the network. There then follows a detailed examination of two key security protocols: IPsec, which provides security at the IP packet level, and Transport Layer Security (TLS), which operates at the transport layer and is the successor to the Secure Sockets Layer (SSL). After a brief discussion of some of the ways to secure Hypertext Transfer Protocol (HTTP) transactions, which are fundamental to the operation of web-based commerce, the chapter describes how hashing and encryption algorithms are used in conjunction with keys to detect modification of data or to hide it completely; the Message Digest 5 (MD5) hashing algorithm is presented as the simplest example. The chapter concludes with an examination of how security keys may be securely exchanged across the network so that they may be used to decrypt or verify transmitted data.
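The hashing idea mentioned above can be seen concretely with Python's standard hashlib and hmac modules (a sketch of the concept, not the protocol machinery; the message and key are made up for illustration):

```python
import hashlib
import hmac

message = b"transfer $100 to account 42"

# An unkeyed digest detects modification of the data -- but only if the
# attacker cannot also replace the digest alongside the message.
digest = hashlib.md5(message).hexdigest()
print("MD5:", digest)

# Changing even one byte of the message changes the digest completely.
tampered = b"transfer $900 to account 42"
print("MD5 of tampered message:", hashlib.md5(tampered).hexdigest())

# Mixing a shared secret key into the hash (an HMAC) means an attacker
# who alters the message cannot recompute a valid digest without the key.
key = b"shared-secret"
mac = hmac.new(key, message, hashlib.md5).hexdigest()
print("HMAC-MD5:", mac)
```

Note that MD5 is used here because the chapter presents it as the simplest example; it is no longer considered secure for new designs.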

Strengths and Weaknesses of Firewalls


At best, a firewall protects a network from undesired access from the rest of the Internet; it cannot provide security to legitimate communication between the inside and the outside of the firewall. In contrast, the cryptography-based security mechanisms described in this chapter are capable of providing secure communication between any participants anywhere. This being the case, why are firewalls so common? One reason is that firewalls can be deployed unilaterally, using mature commercial products, while cryptography-based security requires support at both endpoints of the communication. A more fundamental reason for the dominance of firewalls is that they encapsulate security in a centralized place, in effect factoring security out of the rest of the network. A system administrator can manage the firewall to provide security, freeing the users and applications inside the firewall from security concerns, at least some kinds of security concerns.
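The "centralized policy" idea can be sketched as an ordered list of rules applied to each packet. This is a toy model for illustration only; the rule format and names are invented, not any real product's syntax:

```python
# A firewall as an ordered rule table: the first matching rule decides.
RULES = [
    # (source prefix, destination port, action); None means "any"
    ("10.0.0.",  None, "ALLOW"),   # trust the internal network
    (None,       80,   "ALLOW"),   # anyone may reach the web server
    (None,       None, "DENY"),    # default: deny everything else
]

def filter_packet(src_ip, dst_port):
    """Return the action for a packet, scanning rules in order."""
    for src_prefix, port, action in RULES:
        if src_prefix is not None and not src_ip.startswith(src_prefix):
            continue
        if port is not None and dst_port != port:
            continue
        return action
    return "DENY"

print(filter_packet("10.0.0.5", 22))     # internal host: allowed
print(filter_packet("203.0.113.9", 80))  # outsider reaching the web port: allowed
print(filter_packet("203.0.113.9", 22))  # outsider probing ssh: denied
```

The administrator edits one rule table in one place, which is precisely the "factoring security out of the rest of the network" described above; it is also why the rules say nothing about traffic between two internal hosts, the limitation discussed next.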
Unfortunately, firewalls have serious limitations. Since a firewall does not restrict communication between hosts that are inside the firewall, an adversary who does manage to run code internal to a site can access all local hosts. How might an adversary get inside the firewall? The adversary could be a disgruntled employee with legitimate access. Or the adversary's software could be hidden in some software installed from a CD or downloaded from the Web. Or an adversary could bypass the firewall by using wireless communication or telephone dial-up connections.

Another problem is that any parties granted access through your firewall, such as business partners or externally located employees, become a security vulnerability. If their security is not as good as yours, then an adversary could penetrate your security by penetrating theirs.

Another problem for firewalls is that a service that appears safe to expose may have a bug that makes it unsafe. A classic example is PHF, a phone-book-like service that was available on many websites for looking up names and addresses. A buffer-overflow bug in PHF made it possible for anyone to execute an arbitrary command on the web server by using her browser to enter the command in an input field of the PHF form. Such bugs are discovered regularly, so a system administrator has to constantly monitor announcements of them. Administrators frequently fail to do so, since firewall security breaches routinely exploit security flaws that have been known for some time and have straightforward solutions.
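The PHF hole is one instance of a broad class of bug: form input reaching a command interpreter unescaped. A sketch of the same class in Python (a hypothetical phone-book lookup, not PHF itself):

```python
import shlex

def build_lookup_unsafe(name):
    # BUG: user input is spliced directly into a shell command string.
    # Input such as "smith; rm -rf /" smuggles in a second command.
    return "grep " + name + " phonebook.txt"

def build_lookup_safe(name):
    # Quoting the input (or, better, avoiding the shell entirely by
    # passing subprocess an argument list) keeps it as plain data.
    return "grep " + shlex.quote(name) + " phonebook.txt"

hostile = "smith; cat /etc/passwd"
print(build_lookup_unsafe(hostile))  # the shell would see two commands
print(build_lookup_safe(hostile))    # the shell sees one quoted argument
```

PHF's actual flaw was a buffer overflow rather than shell quoting, but the lesson is the same: a service that looks harmless becomes a remote command launcher the moment untrusted input can influence what gets executed.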
In addition to the (unintended) bugs that may be left accessible by a firewall, there are also what could be thought of as intended, deliberate bugs. Malware (malicious software) is software that is designed to act on a computer in ways concealed from and unwanted by the computer's user. Viruses, worms, and spyware are common types of malware. ("Virus" is sometimes used synonymously with malware, but we will use it in the narrower sense in which it refers to only a particular kind of malware.) Like buggy software, malware code need not be natively executable object code; it could as well be interpreted code such as a script or an executable macro such as those used by Microsoft Word.

Viruses and worms are characterized by the ability to make and spread copies of themselves; the difference between them is that a worm is a complete program, while a virus is a bit of code that is inserted (and inserts copies of itself) into another piece of software, so that it is executed as part of the execution of that piece of software. Viruses and worms typically cause problems such as consuming network bandwidth as mere side effects of attempting to spread copies of themselves. Even worse, they can also deliberately damage a system or undermine its security in various ways. They could, for example, install a backdoor, which is software that allows remote access to the system without the normal authentication. This could lead to a firewall exposing a service that should be providing its own authentication procedures but has been undermined by a backdoor.

Spyware is software that, without authorization, collects and transmits private information about a computer system or its users. Usually spyware is secretly embedded in an otherwise useful program and is spread by users deliberately installing copies. The problem for firewalls is that the transmission of the private information looks like legitimate communication.

A natural question to ask is whether firewalls (or cryptographic security) could keep malware out of a system in the first place. Most malware is indeed transmitted via networks, although it may also be transmitted via portable storage devices such as CDs and memory sticks. One of the two approaches used by antimalware applications is to observe programs for suspicious behavior as they execute, which is clearly not feasible for a firewall that is not on the end-user machine. The other approach is searching for segments of code from known malware, an approach already limited by the ability of clever malware to tweak its representation in various ways. The main problem with implementing this approach in a firewall is the impact on network performance. Cryptographic security cannot eliminate the problem either, although it does provide a means to authenticate the originator of a piece of software and detect any tampering, such as when a virus inserts a copy of itself.







Cybersecurity: Is the U.S. Government doing enough?




Earlier this month, Department of Homeland Security (DHS) Secretary Janet Napolitano declared October National Cybersecurity Awareness Month. To be honest, I have not heard of Awareness Month before, even though it’s been around for six years.
The DHS blog post mentioned this year’s theme as being focused on having everyone, not just industry and government, practice what the DHS calls good “cyber hygiene”. Secretary Napolitano explains:
“Americans can follow a few simple steps to keep themselves safe online. By doing so, you will not only keep your personal assets and information secure but you will also help to improve the overall security of cyberspace.”
Initially, I took offense: “All computer users, not just industry and government, have a responsibility”? Sorry, who is making mistakes? Then I realized: all of us are. After all, cybersecurity is new and uncharted territory, so DHS wanting us to work together makes sense. To underscore that, DHS decided the theme for this year’s Awareness Month should be “Our Shared Responsibility”.
Experts are needed
During her speech, Secretary Napolitano mentioned that DHS has been given the authority to hire 1,000 new cybersecurity professionals. Brian Krebs in his Security Fix post quoted Secretary Napolitano as saying:
“This new hiring authority will enable DHS to recruit the best cyber analysts, developers, and engineers in the world to serve their country by leading the nation’s defenses against cyber threats.”
Secretary Napolitano further mentioned that:
“The department will look to fill critical cybersecurity roles, including cyber risk and strategic analysis, cyber incident response, vulnerability detection and assessment, intelligence and investigation, and network and systems engineering.”
Could be a problem
I was feeling pretty good about this. Then I read IT pundit Bob Cringely’s article entitled “The Cybersecurity Myth”. He contends there aren’t 1,000 cybersecurity experts available:
“The number of CCIE’s with security as a certification is 2,300 for the entire world. Subtract the 50 percent who work for Cisco, then 50 percent again for those not working in the field any longer, and you get 500 Cisco CCIE Security Experts worldwide. The only way to get another thousand in three years is by training them. But, in the last four months with 800 available seats to sit for the Cisco CCIE Security exam only one person has passed!”
One consultant that Mr. Cringely interviewed for his article disagreed, mentioning:
“Sure there are 1,000 (cybersecurity experts), but they are already employed… as hackers.”
That is typical Cringely “shock and awe”, but there is some truth to it.
President Obama’s turn
Sixteen days into National Cybersecurity Awareness Month, President Obama reaffirmed its importance in this video. He also echoed Secretary Napolitano’s call for everyone to do their part. In addition, President Obama proposed the following:
  • Networks are to be considered strategic national assets and will be protected as such.
  • A public-private partnership is needed to protect the privately-held infrastructure.
  • Each of us who use the Internet must take responsibility for our actions and equipment.
Buzz on the security blogs has these being good first steps.
Select a cybersecurity coordinator
President Obama mentioned in the video that he will soon select a cybersecurity coordinator. How soon? President Obama has been hinting at this since February. What many see as indecision has caused several people who were interested in the job to lose patience and withdraw their applications.
The latest to leave was White House senior aide on cybersecurity Melissa Hathaway. You may remember Ms. Hathaway as the person asked to review the federal government’s cyberspace policy. She resigned this past August. Ms. Hathaway candidly voiced her concern:
“I wasn’t willing to continue to wait any longer, because I’m not empowered right now to continue to drive the change. I’ve concluded that I can do more now from a different role,” most likely in the private sector.
It doesn’t take much imagination to realize how difficult this job will be. I have asked several well-regarded CIOs, and they had a tough time determining an appropriate job description, let alone how much authority to give this new position.
Cybersecurity Act of 2009
Besides having a cybersecurity czar report directly to him, Congress may give President Obama unprecedented authority over private-sector Internet services and applications. Right now, the Cybersecurity Act of 2009 is making its way through Congress. Bill co-sponsor Senator Olympia Snowe (R-Maine) explains:
“America’s vulnerability to massive cyber-crime, global cyber-espionage and cyber-attacks has emerged as one of the most urgent national security problems facing our country today. Importantly, this legislation loosely parallels the recommendations in the CSIS [Center for Strategic and International Studies] blue-ribbon panel report to President Obama and has been embraced by a number of industry and government thought leaders.”
Needless to say, several groups including the Electronic Frontier Foundation (EFF) have deep concerns about this bill. They feel the bill would give the federal government too much power, to the point where the critical infrastructure in this country would be federalized.
The EFF is also concerned about the loss of individual privacy, offering the following bill provision as one example:
“The Secretary of Commerce shall have access to all relevant data concerning (critical infrastructure) networks without regard to any provision of law, regulation, rule, or policy restricting such access.”
The age-old “security versus privacy” challenge seems to be upon us once again.
Final thoughts
We all know the critical infrastructure of the United States is vulnerable to electronic attack. It’s also obvious that cybersecurity is complicated. What’s not so obvious to me is if enough is being done and the right decisions are being made. What do you think?


Michael Kassner has been involved with IT for over 30 years. He is currently a systems administrator for an international corporation and a security consultant with MKassner Net. Read his profile or follow him on Twitter at MKassnerNet.



The Cisco Architect: 'PhD of network architecture'


Only a handful of individuals across the globe have been certified in what is considered the top networking accreditation in the ICT industry since it was launched in June.
Speaking at the Cisco Networkers event in Brisbane at the end of September, Learning at Cisco marketing director, Fred Weiller, said the slow uptake resembled the introduction of the Cisco Certified Internetwork Expert's (CCIE) certificate 15 years ago.
The Cisco Certified Design Expert (CCDE) used to be considered the top networking certification until the vendor introduced the Cisco Certified Architect (CCA) ranking. The CCA certification costs US$15,000 and the CCDE is a prerequisite.
Weiller said although the number of CCAs remains in the single digits, there are around 15 individuals with the CCDE globally that are qualified to take the next step.
"From a projection standpoint globally, we're thinking that the number of CCAs will remain in the double digits for a few years and therefore it's safe to predict the number of architects in Australia will remain in the single figures," Weiller said.
The CCA certification was introduced to develop skills in understanding a business and its organisational goals, and then creating a design or blueprint for a network that addresses those goals, whereas the CCIE was originally created for the implementation and troubleshooting of networks.
"Architects will have some very high-level people in front of them and some very high-stake decisions are based on their recommendations," Weiller said.
"In fact, millions of dollars will be committed based on an architect's recommendations, so they need to have the ability to communicate the value, advantages, options and what it means to reach the objectives to the relevant stakeholders."
Weiller urged potential candidates to think about the CCA in terms of a doctorate program, so much so that he even referred to it as 'the PhD of Cisco'.
A solid 10 years' experience in design and architecture is the recommended minimum, along with good writing and communication skills.
Firstly, a potential CCA is required to send Cisco a resume, along with a set of documents outlining the work they have already done in the design and architecture space during their career.
Based on that profile a panel of three judges decides whether a candidate is worthy of certification.
"If you pass the first filter, the panel provides you with an architectural challenge, in the form of a request for proposal. The goal is to really understand those business objectives translating into a technical infrastructure," Weiller said.
Once the candidate has proven to the judges that they can meet the set architectural challenge, they are awarded certification. Re-certification for a CCA will take the form of contributing back to the program and occasionally serving as a judge.
More information about becoming a CCA can be found on Cisco's website.
Thinking of becoming a CCA, or know someone who is? Email Computerworld or follow @computerworldau on Twitter.







Monday, October 26, 2009

Yale Adds New Batch of Free Open Courses

A quick update for you. Yale University has added its third batch of courses to its open education initiative, bringing the total number of courses to 25. The latest round is slightly bigger than previous ones, which bucks the trend that we’re generally seeing. (Open Courses have been in a noticeable slump for the past year.) Below, I have listed the newly added courses and provided links to iTunes, YouTube, and pages where you can download the courses in various other formats. This collection now features over 250 free courses, all ready to download to your computer or mp3 player.

Virtualisation to drive IT infrastructure: Gartner

Virtualisation will be the most important technology in IT infrastructures and operations up to 2010, according to Gartner, dramatically changing how IT departments manage, buy, deploy, plan and charge for their services.

Speaking at Gartner’s Infrastructure, Operations and Data Centre Summit in Sydney recently, Gartner vice president and distinguished analyst Thomas Bittman said that virtualisation was no longer only about server and storage consolidation and cost saving. “It is now less about the technology and more about process change and cultural change within organizations,” said Bittman. “Virtualisation enables alternative delivery models for services. Each virtualised layer can be managed relatively independently or even owned by someone else, for example, streamed applications or employee-owned PCs. This can require major cultural changes for organizations.”

The total number of virtual machines deployed worldwide is expected to increase from 540,000 at the end of 2006 to more than 4 million by 2009, according to Gartner, but this is still only a fraction of the potential market. “Several things will make virtualisation critical to most enterprises in the next few years: the need to consolidate space, power, installation and integration, and providing server resources which are capable of responding to unpredictable workloads,” Bittman said. “By the end of next year, virtual machine hypervisor technology will be almost free, embedded in servers by hardware manufacturers and in operating systems by software vendors, further accelerating adoption.”

Virtualisation is having a considerable impact on the server market worldwide, according to Gartner. “Every virtual server has the potential to take another physical server off the market,” said Bittman. “Today more than 90 per cent of users deploying virtual machines are doing so specifically to reduce x86 server, space and energy costs. We believe that virtualisation reduced the x86 server market by 4 per cent in 2006, and by 2009 it will have a far greater impact.”

Virtualisation on the PC has even more potential than server virtualisation to improve the management of IT infrastructure, according to Bittman. “Virtualisation on the client is perhaps two years behind, but it is going to be much bigger. On the PC, it is about isolation and creating a managed environment that the user can’t touch. This will help change the paradigm of desktop computer management in organizations. It will make the trend towards employee-owned notebooks more manageable, flexible and secure.”

Bittman said that the gap between the well managed and the badly managed IT infrastructure is growing. A November 2006 survey of 700 Gartner clients showed that most organizations are in the very early stages of their infrastructure and operations maturity. “Virtualisation without good management is more dangerous than not using virtualisation in the first place,” said Bittman. “Automation is the critical next step to help organizations stop ‘virtualisation sprawl’, which is not much better than server sprawl.”

Gartner recommends that organizations develop a vision for their own infrastructure and build a plan to get there. “Nothing beats infrastructure and operations when it comes to the ability to impact IT spending, staffing and business performance. Despite all the talk, IT infrastructure has not become a commodity. As technology vendors battle for control over your IT infrastructure, having a vision of your own will help you stay in control.”



RHCE and RHCT Frequently Asked Questions

1. Are the RHCE and RHCT Certification Exams open-book?

No. They are closed-book, with no notes or reference materials permitted other than those distributed with the Red Hat OS on which the exam is taken. Most standardized tests, including most IT certification exams, are closed-book.

   
2. I have heard the RHCE and RHCT Certification Exams described as "performance-based". What is meant by "performance-based" certification or "performance-based" testing?

The RHCE and RHCT exams are performance-based in two very specific and important senses. First, the actual performance of candidates is tested by requiring them to successfully complete installation, configuration, troubleshooting, and maintenance tasks similar to those they must complete on the job as system administrators. Second, we determine their performance on these tasks by whether their systems perform as specified in an objective and verifiable manner.

   


3. Why did Red Hat decide to make the RHCE and RHCT Certification Exams performance-based?

We designed the RHCE Certificate to be performance-based for one good reason: quality. We want the RHCE certificate to be a meaningful, serious certificate, proof of actual competency, unimpeachably better as a measure of actual skill than other OS certifications. When we introduced RHCT in January, 2003, we had the same goals of quality, and felt that RHCE had demonstrated the value of this approach.

Linux professionals want a certificate they can respect and which they know is challenging to earn. Employers need to feel confident when they hire an RHCE or RHCT that the person has demonstrated the skills and competencies required to administer Red Hat systems for critical roles. Consulting companies, VARs, and resellers with RHCEs and RHCTs on staff are able to make a better business case to their customers, and RHCEs themselves report greater confidence, greater success with their Linux implementations for customers or their employers.


4. What's the difference between RHCT and RHCE certifications?

An RHCT has proven technician-level competencies required to install, attach, configure, and manage new Red Hat systems on an existing production network. RHCTs are capable of performing the core system administration common to all systems, regardless of whether they are workstations, servers, network devices, or some other kind of system. An RHCE has also proven these RHCT competencies, and has demonstrated that he or she can configure networking services and security on servers running a Red Hat OS. Please see our RHCE program page and the Exam Prep Guide for more specific information on RHCE and RHCT skills.

 

5. If the RHCT competencies are included within the RHCE Certification Exam, can I earn RHCT if I don't pass RHCE?

Yes, this is possible. When you take the RHCE Exam you are measured on the competencies for RHCT as part of RHCE. You cannot pass RHCE without passing the competencies for RHCT, since an RHCE must be able to do everything an RHCT can do, plus a lot more. Certain competencies are compulsory for RHCT, without which a pass is not possible for either RHCT or RHCE. Additional competencies are compulsory for RHCE, without which a pass is not possible for RHCE. Candidates taking the RHCE certification exam who do not demonstrate the competencies for RHCE may earn the RHCT if they demonstrate the RHCT-specific competencies in the RHCE exam.


6. What is the structure of the RHCE Certification Exam and what is required to pass it?

The RHCE Certification Exam is a single section exam lasting 3.5 hours. In order to earn RHCE, candidates must earn a score of 70 or higher on both the RHCT and RHCE items in the lab exam.

 

7. What is the structure of the RHCT Certification Exam and what is required to pass it?

The RHCT Certification Exam is a single section exam lasting 2.0 hours. In order to earn RHCT, candidates must earn a score of 70 or higher. Candidates in RHCE Certification Exams who fulfill these requirements, but do not fulfill the additional RHCE requirements earn RHCT.


8. Why did Red Hat drop multiple choice from Red Hat Enterprise Linux 3 and higher exams?

After nearly five years of delivering RHCE exams, we concluded that the time spent during the exam asking multiple choice questions would be better spent on performance-based tasks. Analysis of the data we have collected demonstrates that the performance-based sections of the exam were far more effective than the multiple choice section.


9. When do I receive my official results after taking an exam?

Exam results are emailed to candidates within 3 US business days, assuming they have provided accurate contact information. Unfortunately, some mail servers mistakenly treat results notifications as spam and filter them. Candidates who do not receive their results within 3 US business days should contact Red Hat at www.redhat.com/training/certification/comments.html.




11. When do I get my certificate?

For your convenience, you will be issued an electronic certificate that will be attached to your results email.



12. Why doesn't Red Hat send a hard copy certificate?

Electronic certificates provide several benefits. First, they can be sent at the same time that results are processed for immediate use by the person receiving the certificate. Second, they allow certificate holders to print multiple copies for use at home and at the office. Third, they make replacement faster and easier. And yes, they are also more cost-effective, which allows Red Hat to offer the RHCE Certification Exam at the same price today that it did when it launched the program in 1999. In addition, we believe that verification at www.redhat.com/training/certification/verify/ is far more valuable than a physical certificate. A hard copy certificate can be forged by anyone with a computer and a decent graphics program. Having a unique certificate number that is verifiable by its issuer (in this case, Red Hat), is far more authoritative, reliable, and valuable.

 

13. How often can I re-take the RHCE or RHCT Certification Exams?

You can re-take these exams as often as you wish. When taken again, exams must be taken in their entirety, and credit for successfully completed sections in previous exams is not carried forward.



14. Can I cram for the RHCE or RHCT Certification Exams?

No. Red Hat does not recommend cramming. The RHCE Exam is very different from most IT certification tests. It is possible to cram for a multiple-choice test. It is not possible to cram for a live system performance-based test, unless the "cramming" means getting real-world experience. Cramming will not turn an unqualified person into one who is qualified.

  

15. How can I do self-paced study for the RHCE Exam? Are there books or self-paced book and CD kits that Red Hat recommends?

Red Hat recommends its eLearning series as the best mechanism for self-paced study. Red Hat does not endorse and has not authorized any particular RHCE prep books or self-paced study programs. We do not recommend for or against any of these, as we do not have time to review these or measure their performance, and we will not endorse something without being able to vouch for its performance. Red Hat provides an RHCE and RHCT Exam Prep Guide for use by all persons who wish to pursue certification, including those who must prepare on their own.



16. What does Red Hat recommend to prepare for the RHCE and RHCT Certification Exams?

Red Hat recommends that persons interested in preparing for the RHCE and RHCT exams 1) obtain high quality hands-on training such as is available in the Red Hat courses that are designed to cover the skill areas tested by the exams; 2) get hands-on, real-world experience with Red Hat Enterprise Linux systems administration; 3) make sure that the prerequisite networking skills specified in the Prep Guide are obtained before attempting the exams.

Both the RHCE and RHCT test professional-level system administration skills and such skills cannot be obtained through training alone. High quality hands-on training must be accompanied by real-world experience, preferably on the job. Good training can be a vital part of success, but the rest is up to the individual.

Whatever your method of preparation, use the RHCE and RHCT Exam Prep Guide to guide your studies and practice, not third-party materials or other second-hand information, as the Prep Guide is the authoritative guide to what Red Hat tests in its exams.

   

17. For how long will my RHCE certification be considered current?

The validity period for all RHCEs and RHCTs is pegged to the release of the Enterprise product commercially available at the time certification was earned. RHCE and RHCT certifications are considered current until Red Hat retires exams of the release following the version on which your certification was earned. For example, certificates earned on Red Hat Enterprise Linux 3 will be current until August 31, 2007, the final date on which Red Hat Enterprise Linux 4 exams will be offered. Note that Red Hat Enterprise Linux 5 was released in March, months before the final retirement of the version 4 exams.

To provide further clarification for earlier versions, Red Hat Enterprise Linux 4 will remain current until Red Hat Enterprise Linux 5-based exams are retired, several months after the release of Red Hat Enterprise Linux 6. Certifications earned on Red Hat Linux 8.0 and Red Hat Linux 9 are pegged to Red Hat Enterprise Linux 3, and hence are no longer current.

While evidence suggests that RHCEs who stay professionally active can evolve their skills in pace with new releases of Red Hat Enterprise Linux technology, it is important for Red Hat to maintain a policy for determining whether an RHCE or RHCT certificate can be considered current. Thus, verification at Certification Central has always included the version a certificate was earned on, and whether the certificate is considered current or no longer current.

  

18. What are the benefits of getting certified by Red Hat?

Interviews and independent surveys have been conducted on RHCEs and the results substantiate what we have known all along: Performance-based certifications prove competency more meaningfully, and are accorded higher status than other types of certification. The benefits of RHCE for both individuals and their employers and managers are multiple:

    * Confidence and competence: RHCEs report greater confidence in their skills and better success at building and managing Linux servers. The actual quality of their work and their professionalism improve. They are better at performing their jobs for their employers, can take on more challenging assignments, and they receive recognition for this.
    * Career results: RHCEs interviewed also report one or more of the following within 90 days of earning the RHCE certificate: a new job, a raise, a promotion, increased responsibility, assignment to lead or supervisory role, increased recognition and/or prestige among colleagues.
    * Hard dollars: RHCEs earn more because of their certification, as shown by two recent independent surveys from Computer Reseller News and Certification Magazine. These surveys are linked at: Salary Surveys



19. What benefits of RHCE and RHCT are provided directly by Red Hat, and for how long?

Verification services for all certificate holders are provided at Certification Central. Certificate holders and their employers or customers can type in the 15-digit RHCE number and verify that the person is really certified.

RHCEs have exclusive access to RHCE Connection, a special site for RHCEs to receive special offers, discounts and benefits, as well as technical updates and access to resources. Access to RHCE Connection and its services is a value-added benefit provided at Red Hat's sole discretion. RHCTs similarly have access to RHCT Connection.

Certain other discretionary benefits of certification, such as partner programs between Red Hat and the company at which you may be employed, may require you to maintain certification on the most recent major release or otherwise on a more frequent basis than Red Hat GLS stated policy for individuals. This policy ensures a high standard of practice by the Red Hat partner company. It is up to you and/or your employer to stay up to date on the eligibility requirements of such programs.



20. How do I get re-certified for a new release of Red Hat Enterprise Linux?

Take and pass the certification exam on that new release.



21. When should I consider getting re-certified?

Re-certification is largely a matter of your own choice and that of any employer or customer who may have an interest in how current your certificate is. Know your market: if the installed base you service is in a hurry to upgrade to the newest release, or requires features and services in the latest release, then it may be time to re-certify. Regardless of whether you decide to re-certify, you can keep your skills current by using and learning each new version of Red Hat Enterprise Linux.

 

22. Where can I take the RHCE and RHCT Exams?

Red Hat has training facilities at its Raleigh, North Carolina headquarters and worldwide through its own offices and through its Red Hat Certified Training Partners. Contact your nearest Red Hat office for additional information.

   


23. Will training provided by other training vendors who are not Red Hat Certified Training Partners be useful preparation for RHCE?

Unless the training is provided by Red Hat, Inc. or a Red Hat Certified Training Partner authorized by Red Hat, Inc., Red Hat cannot endorse it or vouch for it. Only by taking the courses in the RHCE Program offered by Red Hat, Inc. and Red Hat Certified Training Partners do you have a guarantee that the content, instruction, and design of the courses and curriculum will be up-to-date, professional, and geared for the RHCE program.

  

24. How does the standard track of skills courses (RH033, RH133, RH253) relate to the RH300 and RH301 Rapid Track RHCE courses?

RH300 and RH301 are accelerated training courses for experienced Linux and UNIX systems administrators. RH300 includes the RHCE Certification Exam on the last of five days; RH301 does not. Because of the highly accelerated pace, only course participants with either Linux system administration experience or considerable UNIX system administration experience, including networking services, should take RH300.

In contrast, the suite of RH033, RH133, and RH253 provides a more gradual path for building skills. More time is spent on each topic, and participants are assumed to be doing most of the tasks for the first time. Make no mistake, however: these courses are not "fluff". People who attend them often find the pace and quantity of information challenging compared to other IT training they have attended.

   

25. Will there be other certifications created by Red Hat?

Yes, Red Hat has expanded advanced training beyond the level of RHCE to create the Enterprise Architect curriculum and Red Hat Certified Architect (RHCA) certification. See Enterprise Architect/RHCA



26. What is the Goal of RHCE and RHCT Certification?

The primary goal of RHCE and RHCT certification is to meet the demand of individuals and employers for useful metrics of individual skills and competencies with Red Hat Linux, the largest-selling distribution of Linux. The RHCE Program provides performance-based certification at two critical job role levels: Technician (RHCT) and Engineer (RHCE).

RHCE and/or RHCT may be required for selected personnel employed at Red Hat channel partners, IHVs, ISVs, OEMs, and other partners, to provide meaningful assurance of standards.

 

27. What is the Meaning of RHCE and RHCT Certification?

RHCE or RHCT certification serves as a metric (hopefully one of many) of use to both individuals and employers to assess individual preparation and competency for key job roles involving Red Hat Linux computing.

(a) RHCE certification indicates that the person has passed a realistic performance-based lab exam that tests his/her ability to: install and configure Red Hat Linux; understand limitations of hardware; configure basic networking and file systems for a network; configure the X Window System; perform essential Red Hat Linux system administration; configure basic security for a network server; set up and manage common enterprise networking (IP) services for the organization; and carry out server diagnostics and troubleshooting.

The readiness objective of RHCE is to assure a standard level of systems and network administration skills so that a person is "ready from a technical point of view for professional responsibilities in setting up, configuring, and managing a Red Hat Linux server running common enterprise networking services and security."

(b) RHCT certification indicates that the person has passed a realistic performance-based lab exam that tests his/her ability to: install and configure Red Hat Linux; understand limitations of hardware; configure basic networking and file systems for a single system attached to a network; configure the X Window System; perform essential Red Hat Linux system administration; configure basic host security; set up client-side networking services required to attach to a production network; and carry out basic diagnostics and troubleshooting.

The readiness objective of RHCT is to assure a minimum level of systems administration skills so that a person is "ready from a technical point of view for professional responsibilities in installing, configuring, attaching, and supporting Red Hat Linux systems on an existing production network."

  


28. What is Certification in the context of professionalism?

Becoming a successful technician or engineer requires years of experience in heterogeneous, networked computing environments, coping with day-to-day issues, and developing best practices. This kind of experience does not result from taking one course or exam, but it can be measured during training and in a certification exam, especially if these are administered using hands-on exercises in a lab. That's why the RHCE and RHCT exams are lab-based, performance-based practical exams.

Red Hat is benchmarking the RHCE and RHCT certificates to be useful metrics for measuring experience, skill, and competency with Red Hat Linux, and for demonstrating preparedness for professional responsibilities at two critical levels of Red Hat Linux systems administration.

 

29. What is the Verification and Validity Period?

Red Hat provides complete verification of RHCE certification, including version numbers, at Certification Central, so that individuals, their employers and customers can make their own informed decisions based on what version they are actually running and how critical re-certification is for their own requirements.

The validity period for all RHCEs and RHCTs is pegged to the release of the Enterprise product commercially available at the time certification was earned. RHCE and RHCT certifications are considered current until Red Hat retires exams of the release following the version on which your certification was earned. For example, certificates earned on Red Hat Enterprise Linux 3 will be current until August 31, 2007, the last date on which Red Hat Enterprise Linux 4 exams will be offered. Note that Red Hat Enterprise Linux 5 was released in March, months before the final retirement of the version 4 exams.

To provide further clarification for earlier versions, Red Hat Enterprise Linux 4 will remain current until Red Hat Enterprise Linux 5 exams are retired, several months after the release of Red Hat Enterprise Linux 6. Certifications earned on Red Hat Linux 8.0 and Red Hat Linux 9 are pegged to Red Hat Enterprise Linux 3, and hence will no longer be current after August 31, 2007.

Our information suggests that the RHCE is such a strong certification that RHCEs in continuous practice as professionals are likely to be able to keep their skill levels up in pace with Red Hat Linux technology. Some Red Hat partner programs mandate RHCEs maintain certification on the most recent release.

WWW Robots: What You Need to Know

What is a WWW robot?


A robot is a program that automatically traverses the Web's hypertext structure by retrieving a document, and recursively retrieving all documents that are referenced.

Note that "recursive" here doesn't limit the definition to any specific traversal algorithm; even if a robot applies some heuristic to the selection and order of documents to visit and spaces out requests over a long period of time, it is still a robot.

Normal Web browsers are not robots, because they are operated by a human, and don't automatically retrieve referenced documents (other than inline images).

Web robots are sometimes referred to as Web Wanderers, Web Crawlers, or Spiders. These names are a bit misleading, as they give the impression the software itself moves between sites like a virus; this is not the case: a robot simply visits sites by requesting documents from them.


What is an agent?



The word "agent" has many meanings in computing these days. Specifically:

Autonomous agents
    are programs that do travel between sites, deciding themselves when to move and what to do. These can only travel between special servers and are currently not widespread on the Internet.
Intelligent agents
    are programs that help users with things, such as choosing a product, guiding a user through form filling, or even helping users find things. These generally have little to do with networking.
User-agent
    is the technical name for programs that perform networking tasks for a user, such as Web User-agents like Netscape Navigator and Microsoft Internet Explorer, and email User-agents like Qualcomm Eudora.


What is a search engine?


A search engine is a program that searches through some dataset. In the context of the Web, the term "search engine" is most often used for search forms that search through databases of HTML documents gathered by a robot.

Robots can be used for a number of purposes:

    * Indexing
    * HTML validation
    * Link validation
    * "What's New" monitoring
    * Mirroring

So what are Robots, Spiders, Web Crawlers, Worms, Ants?


They're all names for the same sort of thing, with slightly different connotations:

Robots
    the generic name, see above.
Spiders
    same as robots, but sounds cooler in the press.
Worms
    same as robots, although technically a worm is a replicating program, unlike a robot.
Web crawlers
    same as robots, but note that WebCrawler is a specific robot.
WebAnts
    distributed cooperating robots.



Aren't robots bad for the web?


There are a few reasons people believe robots are bad for the Web:

    * Certain robot implementations can overload networks and servers (and have done so in the past). This happens especially with people who are just starting to write a robot; these days there is sufficient information available on robots to prevent some of these mistakes.
    * Robots are operated by humans, who make mistakes in configuration, or simply don't consider the implications of their actions. This means people need to be careful, and robot authors need to make it difficult for people to make mistakes with bad effects.
    * Web-wide indexing robots build a central database of documents, which doesn't scale too well to millions of documents on millions of sites.

But at the same time, the majority of robots are well designed, professionally operated, cause no problems, and provide a valuable service in the absence of widely deployed better solutions.

So no, robots aren't inherently bad, nor inherently brilliant; they need careful attention.



How does a robot decide where to visit?


This depends on the robot, each one uses different strategies. In general they start from a historical list of URLs, especially of documents with many links elsewhere, such as server lists, "What's New" pages, and the most popular sites on the Web.

Most indexing services also allow you to submit URLs manually, which will then be queued and visited by the robot.

Sometimes other sources of URLs are used, such as scanners of USENET postings, published mailing list archives, etc.

Given those starting points a robot can select URLs to visit and index, and to parse and use as a source for new URLs.



How does an indexing robot decide what to index?


If an indexing robot knows about a document, it may decide to parse it, and insert it into its database. How this is done depends on the robot: Some robots index the HTML Titles, or the first few paragraphs, or parse the entire HTML and index all words, with weightings depending on HTML constructs, etc. Some parse the META tag, or other special hidden tags.

We hope that as the Web evolves, more facilities become available to efficiently associate metadata such as indexing information with a document. This is being worked on...


How do I register my page with a robot?


You guessed it, it depends on the service :-) Many services have a link to a URL submission form on their search page, or have more information in their help pages. For example, Google has Information for Webmasters.



This is referred to as "SEO" -- Search Engine Optimisation. Many web sites, forums, and companies exist that aim/claim to help with that.

But it basically comes down to this:

    * In your site design, use text rather than images and Flash for important content
    * Make your site work with JavaScript, Java and CSS disabled
    * Organise your site such that you have pages that focus on a particular topic
    * Avoid HTML frames and iframes
    * Use normal URLs, avoiding links that look like form queries (http://www.example.com/engine?id)
    * Market your site by having other relevant sites link to yours
    * Don't try to cheat the system (by stuffing your pages with keywords, attempting to target specific content at search engines, or using link farms)




Can I use /robots.txt or meta tags to remove offensive content on some other site from a search engine?


No, because those tools can only be used by the person controlling the content on that site.

You will have to contact the site and ask them to remove the offensive content, and ask them to take steps to remove it from the search engine too. That usually involves using /robots.txt, and then using the search engine's tools to request the content to be removed.



How do I know if I've been visited by a robot?


You can check your server logs for sites that retrieve many documents, especially in a short time.

If your server supports User-agent logging you can check for retrievals with unusual User-agent header values.

Finally, if you notice a site repeatedly checking for the file '/robots.txt', chances are that it is a robot too.
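Those log checks can be automated. Below is a minimal sketch (the function name, the filename, and the assumption of Common Log Format lines are all illustrative; adjust the pattern to your own server's log format) that counts requests per IP and flags IPs that fetched /robots.txt:

```python
# Sketch: spot likely robots in a Common Log Format access log.
import re
from collections import Counter

# Matches lines like: 1.2.3.4 - - [date] "GET /path HTTP/1.0" ...
LINE_RE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]*\] "(?:GET|HEAD|POST) (\S+)')

def robot_candidates(log_lines):
    """Count total requests per IP, and /robots.txt fetches per IP
    (a /robots.txt fetch is a strong hint that the client is a robot)."""
    requests_per_ip = Counter()
    robots_txt_hits = Counter()
    for line in log_lines:
        m = LINE_RE.match(line)
        if not m:
            continue  # skip lines that don't parse as CLF
        ip, path = m.groups()
        requests_per_ip[ip] += 1
        if path == "/robots.txt":
            robots_txt_hits[ip] += 1
    return requests_per_ip, robots_txt_hits
```

IPs that appear in both counters with high request totals are good robot candidates; combine this with User-agent logging where your server supports it.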



A robot is traversing my whole site too fast!


This is called "rapid-fire", and people usually notice it if they're monitoring or analysing an access log file.

First of all, check whether it is a problem by checking the load of your server, monitoring your server's error log, and watching concurrent connections if you can. If you have a medium- or high-performance server, it is quite likely to be able to cope with a high load of even several requests per second, especially if the visits are quick.

However, you may have problems if you have a low-performance site, such as the desktop PC or Mac you're working on, if you run low-performance server software, or if you have many long retrievals (such as CGI scripts or large documents). These problems manifest themselves in refused connections, a high load, performance slowdowns, or in extreme cases a system crash.

If this happens, there are a few things you should do. Most importantly, start logging information: when you noticed it, what happened, what your logs say, what you are doing in response, etc.; this helps when investigating the problem later. Secondly, try to find out where the robot came from: what IP addresses or DNS domains. If you can identify a site this way, you can email the person responsible and ask them what's up. If this doesn't help, try their own site for telephone numbers, or mail postmaster at their domain.
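To make "rapid-fire" concrete, here is a small sketch that flags IPs making more than a threshold of requests inside a sliding time window (the function name, window, and threshold are illustrative choices, not any standard):

```python
# Sketch: flag IPs that ever exceed `threshold` requests within `window` seconds.
from collections import defaultdict

def rapid_fire_ips(events, window=10.0, threshold=20):
    """events: iterable of (timestamp_in_seconds, ip) pairs,
    e.g. extracted from an access log. Returns the set of offending IPs."""
    by_ip = defaultdict(list)
    for ts, ip in events:
        by_ip[ip].append(ts)
    offenders = set()
    for ip, times in by_ip.items():
        times.sort()
        start = 0
        for end in range(len(times)):
            # shrink the window from the left until it spans <= `window` seconds
            while times[end] - times[start] > window:
                start += 1
            if end - start + 1 > threshold:
                offenders.add(ip)
                break
    return offenders
```

An IP this flags is worth checking against your error log and, as suggested above, tracing back to its operator.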


How do I prevent robots scanning my site?



The quick way to prevent robots visiting your site is to put these two lines into the /robots.txt file on your server:

User-agent: *
Disallow: /

but this only helps with well-behaved robots.
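To illustrate what "well-behaved" means in practice, a polite robot written in Python could honor those two lines using the standard library's urllib.robotparser (a minimal sketch; the bot name and URL here are made up, and a real robot would fetch the file from the target site rather than parse a literal):

```python
# Sketch: how a polite robot checks /robots.txt rules before fetching a page.
from urllib import robotparser

rp = robotparser.RobotFileParser()
# A real robot would call rp.set_url(".../robots.txt") and rp.read();
# here we parse the two-line example from the text directly.
rp.parse([
    "User-agent: *",
    "Disallow: /",
])

# "Disallow: /" blocks every path for every robot, so this is False:
print(rp.can_fetch("ExampleBot", "http://www.example.com/page.html"))
```

A robot that skips this check will fetch the page anyway, which is exactly why /robots.txt only helps against well-behaved robots.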