Tuesday, April 30, 2013

What is a robot


A robot is a mechanical or virtual artificial agent, usually an electro-mechanical machine that is guided by a computer program or electronic circuitry. Robots can be autonomous or semi-autonomous and range from humanoids such as Honda's Advanced Step in Innovative Mobility (ASIMO) and Tosy's TOSY Ping Pong Playing Robot (TOPIO) to industrial robots, collectively programmed 'swarm' robots, and even microscopic nano robots. By mimicking a lifelike appearance or automating movements, a robot may convey a sense of intelligence or thought of its own.

Robotics is the branch of technology that deals with the design, construction, operation, and application of robots,[1] as well as computer systems for their control, sensory feedback, and information processing. These technologies deal with automated machines that can take the place of humans in dangerous environments or manufacturing processes, or resemble humans in appearance, behavior, and/or cognition. Many of today's robots are inspired by nature, contributing to the field of bio-inspired robotics.

As mechanical techniques developed through the Industrial age, more practical applications were proposed by Nikola Tesla, who in 1898 designed a radio-controlled boat. Electronics evolved into the driving force of development with the advent of the first electronic autonomous robots created by William Grey Walter in Bristol, England in 1948. The first digital and programmable robot was invented by George Devol in 1954 and was named the Unimate. It was sold to General Motors in 1961 where it was used to lift pieces of hot metal from die casting machines at the Inland Fisher Guide Plant in the West Trenton section of Ewing Township, New Jersey.[2]

Robots have replaced humans[3] in performing repetitive and dangerous tasks which humans prefer not to do, or are unable to do because of size limitations or extreme environments, such as outer space or the bottom of the sea, where humans could not survive.
There are concerns about the increasing use of robots and their role in society. Robots are blamed for rising unemployment as they replace workers in some functions. The use of robots in military combat raises ethical concerns. The possibility of robot autonomy and potential repercussions has been addressed in fiction and may be a realistic concern in the future.

Source: Wikipedia

German shepherd



German shepherd, dog

The German Shepherd (German: Deutscher Schäferhund, German pronunciation: [ˈʃɛːfɐˌhʊnt]) is a breed of large-sized dog that originated in Germany.[3] German Shepherds are a relatively new breed of dog, with their origin dating to 1899. As part of the Herding Group, German Shepherds are working dogs developed originally for herding and guarding sheep. Because of their strength, intelligence and abilities in obedience training, they are often employed in police and military roles around the world.[4] German Shepherds currently account for 4.6% of all dogs registered with the American Kennel Club.

Source: Wikipedia

Sunday, April 28, 2013

How to make a computer faster using notepad



Follow these 4 easy steps to make your computer faster:

1: Open notepad

2: Write: mystring=(80000000)

3: Save the file as RAM.vbe (set "Save as type" to "All Files" so Notepad does not add a .txt extension).

4: Double click on the file.

Now your computer will be much faster. Once you have double-clicked the file, you can delete it.

Note: I use Windows XP, and I am sure it works on Windows XP. I don't know whether it will work on other operating systems, but you can try them too; it won't do anything harmful to your computer.

Saturday, April 27, 2013

What is the internet



The Internet is a global system of interconnected computer networks that use the standard Internet protocol suite (TCP/IP) to serve billions of users worldwide. It is a network of networks that consists of millions of private, public, academic, business, and government networks, of local to global scope, that are linked by a broad array of electronic, wireless and optical networking technologies. The Internet carries an extensive range of information resources and services, such as the inter-linked hypertext documents of the World Wide Web (WWW) and the infrastructure to support email.

Most traditional communications media including telephone, music, film, and television are being reshaped or redefined by the Internet, giving birth to new services such as Voice over Internet Protocol (VoIP) and Internet Protocol Television (IPTV). Newspaper, book and other print publishing are adapting to Web site technology, or are reshaped into blogging and web feeds. The Internet has enabled and accelerated new forms of human interactions through instant messaging, Internet forums, and social networking. Online shopping has boomed both for major retail outlets and small artisans and traders. Business-to-business and financial services on the Internet affect supply chains across entire industries.

The origins of the Internet reach back to research commissioned by the United States government in the 1960s to build robust, fault-tolerant communication via computer networks. The funding of a new U.S. backbone by the National Science Foundation in the 1980s, as well as private funding for other commercial backbones, led to worldwide participation in the development of new networking technologies, and the merger of many networks. The commercialization of what was by the 1990s an international network resulted in its popularization and incorporation into virtually every aspect of modern human life. As of June 2012, more than 2.4 billion people—over a third of the world's human population—have used the services of the Internet; approximately 100 times more people than were using it in 1995, when it was mostly used by tech-savvy middle and upper class people in the United States and several other countries. [1][2]

The Internet has no centralized governance in either technological implementation or policies for access and usage; each constituent network sets its own policies. Only the overarching definitions of the two principal name spaces in the Internet, the Internet Protocol address space and the Domain Name System, are directed by a maintainer organization, the Internet Corporation for Assigned Names and Numbers (ICANN). The technical underpinning and standardization of the core protocols (IPv4 and IPv6) is an activity of the Internet Engineering Task Force (IETF), a non-profit organization of loosely affiliated international participants that anyone may associate with by contributing technical expertise.

Source: Wikipedia

What is WiFi



Wi-Fi (also spelled Wifi or WiFi) is a popular technology that allows an electronic device to exchange data wirelessly (using radio waves) over a computer network, including high-speed Internet connections. The Wi-Fi Alliance defines Wi-Fi as any "wireless local area network (WLAN) products that are based on the Institute of Electrical and Electronics Engineers' (IEEE) 802.11 standards".[1] However, since most modern WLANs are based on these standards, the term "Wi-Fi" is used in general English as a synonym for "WLAN". Only Wi-Fi products that complete Wi-Fi Alliance interoperability certification testing successfully may use the "Wi-Fi CERTIFIED" trademark.

A device that can use Wi-Fi (such as a personal computer, video-game console, smartphone, digital camera, tablet or digital audio player) can connect to a network resource such as the Internet via a wireless network access point. Such an access point (or hotspot) has a range of about 20 meters (65 feet) indoors and a greater range outdoors. Hotspot coverage can comprise an area as small as a single room with walls that block radio waves or as large as many square miles — this is achieved by using multiple overlapping access points.

Wi-Fi can be less secure than wired connections (such as Ethernet) because an intruder does not need a physical connection. Web pages that use SSL are secure, but unencrypted internet traffic can easily be intercepted by intruders. Because of this, Wi-Fi has adopted various encryption technologies. The early encryption standard, WEP, proved easy to break. Higher quality protocols (WPA, WPA2) were added later. An optional feature added in 2007, called Wi-Fi Protected Setup (WPS), had a serious flaw that allowed an attacker to recover the router's password.[2] The Wi-Fi Alliance has since updated its test plan and certification program to ensure all newly certified devices resist attacks.

Source: Wikipedia

Thursday, April 25, 2013

Cheats of Counter-Strike: Condition Zero


Cheat - Effect
bot_kill - Kills all bots, allowing you to win if the bomb is not planted
restart - Restarts the map without losing any goals
cl_levellocks 16382 - Unlocks all of the Deleted Scenes
noclip - Go through walls
notarget - Enemies do not see you
god - Invincibility
sv_cheats 1 - Enables cheats
bot_zombie 1 - Makes the bots just stand there
bot_pistols_only - The bots will only buy pistols
kick <bot's name> - Kicks the named bot
bot_sniper_only - Bots only buy sniper rifles
bot_goto_mark #number - Makes the bot go to specific places
bot_difficulty - Sets the bots' difficulty level
bot_knives_only 1 - Bots only use knives
fly - Enables you to fly
mp_freezetime <value> - Freezes time for the set amount; the player can still walk around
sv_gravity <value> - Sets the gravity; lower numbers let you float along, higher numbers mean you take more damage from falling
bot_stop 1 - Makes the bots stand still
bot_allow_rogues 0 - Disables rogue bots, so they will all follow your commands
bot_defer_to_human 0 - The bots will not wait for you to rescue hostages or plant or defuse the bomb
mp_startmoney 16000 - Gives you the highest amount of money you can have
v - Instant Max Money
+graph - Shows a little information at the very bottom right of your screen
-graph - Disables the little information in the bottom right corner when you have +graph on
sv_restartround 1 - The game restarts after 1 second
decalfrequency 0 - Enables unlimited spraying
gl_spriteblend <0-1> - Turns blood thickening on or off. 0 = off, 1 = on
cl_righthand <0-1> - Sets which hand the weapon is held in. 0 = left-handed, 1 = right-handed
mp_autoteambalance <0-1> - Enables auto team balance. 0 = off, 1 = on
mp_autokick <0-1> - Enables the local player to automatically kick other players. 0 = off, 1 = on
hud_deathnoticetime # - Shows the last deaths of players for the specified number of seconds
sv_clienttrace # - Bullets fired at walls will go through them. Effective if the player is hiding
mp_fadetoblack <0-1> - When set to 1, the screen fades to black when a player dies; set to 0 to disable
quit - Ends the game and quits Condition Zero
disconnect - Ends the game but does not quit Condition Zero
clear - Erases all messages in the console
list - Lists available servers to join
alias "<letter>" "<command>" - Executes the command by just entering <letter> in the console
sv_restart 1 - The game restarts after 1 second, similar to sv_restartround 1
mp_tkpunish <0-1> - When enabled, bots/players who kill their teammates will die on the next round. 0 = off, 1 = on
mp_hostagepenalty <value> - Automatically kicks a bot/player who kills the set number of hostages. Set to 0 to disable
maxplayers <value> - Sets the maximum number of players when creating a new multiplayer game. Only works before you start a match
+commandmenu - Displays commands for faster use on the left side of your screen
-commandmenu - Closes the command menu; use only if you have "+commandmenu" activated
autobuy - Automatically buys the best gun for your team: AK-47 for T and M4A1 for CT
timerefresh - Refreshes the time in the console
bot_knives_only # - Replace "#" with 1 or 0 to make the bots use only knives. 1 = only knives, 0 = not just knives. Default is 0
bot_difficulty - Sets the bots' difficulty
career_restart - Restarts the round
career_end_round - Ends the round, and you lose
mp_friendlyfire # - Replace "#" with 1 or 0 to toggle friendly fire. 1 = on, 0 = off
mp_startmoney # - Replace "#" with a number to change the starting money. Default is 800
mp_c4timer # - Replace "#" with a number to change the C4 timer. Default is 45
sv_gravity # - Replace "#" with a number to change the gravity. Default is 800
bot_zombie # - Replace "#" with 1 or 0 to make the bots stand still or move normally. 1 = stand still, 0 = move normally
jointeam 6 ; jointeam 1 - Revive as a Terrorist
jointeam 6 ; jointeam 2 - Revive as a Counter-Terrorist
impulse 101 - Does not give you more weapons; instead it gives you full money
kill - Suicide

  Spawn indicated items

To do this, go into a level and press the ` key. Type sv_cheats 1 to enable cheats first, then type give weapon_<name of weapon or item>. Here is the list of them (NOTE: this works only for the Deleted Scenes). PS: These are not all of the codes; you can also type items such as hegrenade in short form, and any other item you want that is in the game.
Cheat - Effect
weapon_aug - Spawns the named weapon
weapon_ak47 - Spawns the named weapon
weapon_awp - Spawns the named weapon
weapon_deagle - Spawns the named weapon
weapon_famas - Spawns the named weapon
weapon_fiveseven - Spawns the named weapon
weapon_g3sg1 - Spawns the named weapon
weapon_glock18 - Spawns the named weapon
weapon_m3 - Spawns the named weapon
weapon_m4a1 - Spawns the named weapon
weapon_mac10 - Spawns the named weapon
weapon_mp5navy - Spawns the named weapon
weapon_p228 - Spawns the named weapon
weapon_p90 - Spawns the named weapon
weapon_scout - Spawns the named weapon
weapon_sg550 - Spawns the named weapon
weapon_sg552 - Spawns the named weapon
weapon_tmp - Spawns the named weapon
weapon_ump45 - Spawns the named weapon
weapon_usp - Spawns the named weapon
weapon_xm1014 - Spawns the named weapon
weapon_m60 - Gives you the M60 machine gun
weapon_shieldgun - Gives you a shield and a pistol
ammo_generic - Gives you full ammo for all weapons in your inventory
item_healthkit 1 - Gives you 15 health; the number multiplies the amount restored
item_armor 1 - Gives you 15 armor; the number multiplies the amount restored
weapon_laws - Gives you the LAW (rocket launcher)

Wednesday, April 24, 2013

What is a computer file



A computer file is a resource for storing information, which is available to a computer program and is usually based on some kind of durable storage. A file is durable in the sense that it remains available for programs to use after the current program has finished. Computer files can be considered as the modern counterpart of paper documents which traditionally are kept in offices' and libraries' files, and this is the source of the term. A group of files used by the same program can be packed into one archive file.

What is a website

A website, also written as Web site,[1] web site, or simply site,[2] is a set of related web pages served from a single web domain. A website is hosted on at least one web server, accessible via a network such as the Internet or a private local area network through an Internet address known as a Uniform Resource Locator. All publicly accessible websites collectively constitute the World Wide Web.

A webpage is a document, typically written in plain text interspersed with formatting instructions of Hypertext Markup Language (HTML, XHTML). A webpage may incorporate elements from other websites with suitable markup anchors.

Webpages are accessed and transported with the Hypertext Transfer Protocol (HTTP), which may optionally employ encryption (HTTP Secure, HTTPS) to provide security and privacy for the user of the webpage content. The user's application, often a web browser, renders the page content according to its HTML markup instructions onto a display terminal.

The pages of a website can usually be accessed from a simple Uniform Resource Locator (URL) called the web address. The URLs of the pages organize them into a hierarchy, although hyperlinking between them conveys the reader's perceived site structure and guides the reader's navigation of the site, which generally includes a home page with most of the links to the site's web content, and supplementary about, contact and link pages.

Some websites require a subscription to access some or all of their content. Examples of subscription websites include many business sites, parts of news websites, academic journal websites, gaming websites, file-sharing websites, message boards, web-based email, social networking websites, websites providing real-time stock market data, and websites providing various other services (e.g., websites offering storing and/or sharing of images, files and so forth).

Source: Wikipedia

What is a blog



A blog (a portmanteau of the term web log)[1] is a discussion or informational site published on the World Wide Web and consisting of discrete entries ("posts") typically displayed in reverse chronological order (the most recent post appears first). Until 2009 blogs were usually the work of a single individual, occasionally of a small group, and often covered a single subject. More recently "multi-author blogs" (MABs) have developed, with posts written by large numbers of authors and professionally edited. MABs from newspapers, other media outlets, universities, think tanks, interest groups and similar institutions account for an increasing quantity of blog traffic. The rise of Twitter and other "microblogging" systems helps integrate MABs and single-author blogs into societal newstreams. Blog can also be used as a verb, meaning to maintain or add content to a blog.

The emergence and growth of blogs in the late 1990s coincided with the advent of web publishing tools that facilitated the posting of content by non-technical users. (Previously, a knowledge of such technologies as HTML and FTP had been required to publish content on the Web.)
A majority are interactive, allowing visitors to leave comments and even message each other via GUI widgets on the blogs, and it is this interactivity that distinguishes them from other static websites.[2] In that sense, blogging can be seen as a form of social networking. Indeed, bloggers do not only produce content to post on their blogs, but also build social relations with their readers and other bloggers.[3] There are high-readership blogs which do not allow comments, such as Daring Fireball.

Many blogs provide commentary on a particular subject; others function as more personal online diaries; others function more as online brand advertising of a particular individual or company. A typical blog combines text, images, and links to other blogs, Web pages, and other media related to its topic. The ability of readers to leave comments in an interactive format is an important contribution to the popularity of many blogs. Most blogs are primarily textual, although some focus on art (art blogs), photographs (photoblogs), videos (video blogs or "vlogs"), music (MP3 blogs), and audio (podcasts). Microblogging is another type of blogging, featuring very short posts. In education, blogs can be used as instructional resources. These blogs are referred to as edublogs.

On 16 February 2011, there were over 156 million public blogs in existence.[4] On 13 October 2012, there were around 77 million Tumblr[5] and 56.6 million WordPress[6] blogs in existence worldwide. According to critics and other bloggers, Blogger is the most popular blogging service used today.[7][8]

Source: Wikipedia

Tuesday, April 23, 2013

What is a mobile?

A mobile phone (also known as a cellular phone, cell phone, and a hand phone) is a device that can make and receive telephone calls over a radio link while moving around a wide geographic area. It does so by connecting to a cellular network provided by a mobile phone operator, allowing access to the public telephone network. By contrast, a cordless telephone is used only within the short range of a single, private base station.

In addition to telephony, modern mobile phones also support a wide variety of other services such as text messaging, MMS, email, Internet access, short-range wireless communications (infrared, Bluetooth), business applications, gaming and photography. Mobile phones that offer these and more general computing capabilities are referred to as smartphones.

The first hand-held mobile phone was demonstrated by John F. Mitchell[1][2][3] and Dr Martin Cooper of Motorola in 1973, using a handset weighing around 2.2 pounds (1 kg).[4] In 1983, the DynaTAC 8000x was the first to be commercially available. From 1990 to 2011, worldwide mobile phone subscriptions grew from 12.4 million to over 6 billion, penetrating about 87% of the global population and reaching the bottom of the economic pyramid.[5][6][7][8]

In the first quarter of 2012, Nokia, which had been the global market leader in mobile phones since 1998, slipped into second place with 22.5% market share, behind Samsung with 25.4%, with Apple Inc. trailing in third place with 9.5%.[9] In 2012, for the first time since 2009, mobile phone sales to end users declined, falling 1.7 percent to 1.75 billion units.[10]

Source: Wikipedia

What is a computer


A computer is an electronic device that manipulates information, or "data." It has the ability to store, retrieve, and process data. You can use a computer to type documents, send email, and browse the internet. You can also use it to handle spreadsheets, accounting, database management, presentations, games, and more.

How to make a computer faster



You can use programs to speed it up, or do it manually.
I recommend Advanced System Care and Game Booster.

Go to Control Panel and look for Programs and Features, then uninstall unwanted games and programs. If there are two firewalls, disable one, because they collide with each other and freeze your computer; the same goes for anti-virus software. Also go into a program like Microsoft Word or Paint and delete unwanted documents. Then go to Run, type in msconfig, and disable anything you don't want to start up with your computer. After all of that, do a hard drive Disk Cleanup, then defragment your computer (both are found in the Accessories folder, under System Tools), and restart your computer. That should do it!
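For reference, the same tools can also be opened from Start > Run on Windows XP by typing msconfig (startup configuration), cleanmgr (Disk Cleanup), dfrg.msc (Disk Defragmenter) or appwiz.cpl (Add or Remove Programs).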

Monday, April 22, 2013

Why are there seven days in a week?

The origin of the seven-day week is the religious significance that was placed on the seventh day by ancient cultures, including the Babylonian and Jewish civilizations. Babylonians celebrated a holy day every seven days, starting from the new moon, then the first visible crescent of the Moon, but adjusted the number of days of the final "week" in each month so that months would continue to commence on the new moon. (The seven-day week is only 23.7% of a lunation, so a continuous cycle of seven-day weeks rapidly loses synchronization with the lunation.) Jews celebrated every seventh day, within a continuous cycle of seven-day weeks, as a holy day of rest from their work, in remembrance of Creation week. The Zoroastrian calendar follows the Babylonian in relating the seventh and other days of the month to Ahura Mazda.[1] The earliest ancient sources record a seven-day week in ancient Babylon prior to 600 BC. [2]
The seven-day week being approximately a quarter of a lunation has been proposed (e.g. by Friedrich Delitzsch) as the implicit, astronomical origin of the seven-day week. Problems with the proposal include lack of synchronization, variation in individual lunar phase lengths, and incompatibility with the duodecimal (base-12) and sexagesimal (base-60) numeral systems, historically the primary bases of other chronological and calendar units. For instance, the Chinese Han Dynasty (from 206 BCE) used five-day and ten-day cycles. There are no historical Jewish or Babylonian records that confirm that these cultures explicitly defined the seven-day week as a quarter of a lunation.

What is Computer Programming?



Computer programming (often shortened to programming, scripting, or coding) is the process of designing, writing, testing, debugging, and maintaining the source code of computer programs. This source code is written in one or more programming languages (such as C++, C#, Java, Python, Smalltalk, etc.). The purpose of programming is to create a set of instructions that computers use to perform specific operations or to exhibit desired behaviors. The process of writing source code often requires expertise in many different subjects, including knowledge of the application domain, specialized algorithms and formal logic.
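For example, the following deliberately trivial Python sketch (the function name and values are made up for illustration) is source code: a short set of instructions the computer follows to perform one specific operation, converting a temperature from Celsius to Fahrenheit.

    def celsius_to_fahrenheit(celsius):
        # A precise instruction the computer can execute.
        return celsius * 9 / 5 + 32

    print(celsius_to_fahrenheit(20))  # prints 68.0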

Within software engineering, programming (the implementation) is regarded as one phase in a software development process.

There is an ongoing debate on the extent to which the writing of programs is an art form, a craft, or an engineering discipline.[1] In general, good programming is considered to be the measured application of all three, with the goal of producing an efficient and evolvable software solution (the criteria for "efficient" and "evolvable" vary considerably). The discipline differs from many other technical professions in that programmers, in general, do not need to be licensed or pass any standardized (or governmentally regulated) certification tests in order to call themselves "programmers" or even "software engineers." Because the discipline covers many areas, which may or may not include critical applications, it is debatable whether licensing is required for the profession as a whole. In most cases, the discipline is self-governed by the entities which require the programming, and sometimes very strict environments are defined (e.g. United States Air Force use of AdaCore and security clearance). However, representing oneself as a "Professional Software Engineer" without a license from an accredited institution is illegal in many parts of the world.

Another ongoing debate is the extent to which the programming language used in writing computer programs affects the form that the final program takes. This debate is analogous to that surrounding the Sapir–Whorf hypothesis[2] in linguistics and cognitive science, which postulates that a particular spoken language's nature influences the habitual thought of its speakers. Different language patterns yield different patterns of thought. This idea challenges the possibility of representing the world perfectly with language, because it acknowledges that the mechanisms of any language condition the thoughts of its speaker community.

History

Ada Lovelace created the first algorithm designed for processing by a computer and is usually recognized as history's first computer programmer.

Ancient cultures had no conception of computing beyond simple arithmetic. The only mechanical device that existed for numerical computation at the beginning of human history was the abacus, invented in Sumeria circa 2500 BC. Later, the Antikythera mechanism, invented some time around 100 BC in ancient Greece, was the first mechanical calculator utilizing gears of various sizes and configuration to perform calculations,[3] which tracked the metonic cycle still used in lunar-to-solar calendars, and which is consistent for calculating the dates of the Olympiads.[4] The Kurdish medieval scientist Al-Jazari built programmable Automata in 1206 AD. One system employed in these devices was the use of pegs and cams placed into a wooden drum at specific locations, which would sequentially trigger levers that in turn operated percussion instruments. The output of this device was a small drummer playing various rhythms and drum patterns.[5][6]

The Jacquard Loom, which Joseph Marie Jacquard developed in 1801, uses a series of pasteboard cards with holes punched in them. The hole pattern represented the pattern that the loom had to follow in weaving cloth. The loom could produce entirely different weaves using different sets of cards. Charles Babbage adopted the use of punched cards around 1830 to control his Analytical Engine. The first computer program was written for the Analytical Engine by mathematician Ada Lovelace to calculate a sequence of Bernoulli numbers.[7]

The synthesis of numerical calculation, predetermined operation and output, along with a way to organize and input instructions in a manner relatively easy for humans to conceive and produce, led to the modern development of computer programming. Development of computer programming accelerated through the Industrial Revolution.

Data and instructions were once stored on external punched cards, which were kept in order and arranged in program decks.

In the 1880s, Herman Hollerith invented the recording of data on a medium that could then be read by a machine. Prior uses of machine readable media, above, had been for lists of instructions (not data) to drive programmed machines such as Jacquard looms and mechanized musical instruments. "After some initial trials with paper tape, he settled on punched cards..."[8] To process these punched cards, first known as "Hollerith cards" he invented the keypunch, sorter, and tabulator unit record machines.[9] These inventions were the foundation of the data processing industry. In 1896 he founded the Tabulating Machine Company (which later became the core of IBM). The addition of a control panel (plugboard) to his 1906 Type I Tabulator allowed it to do different jobs without having to be physically rebuilt. By the late 1940s, there were several unit record calculators, such as the IBM 602 and IBM 604, whose control panels specified a sequence (list) of operations and thus were programmable machines.

The invention of the von Neumann architecture allowed computer programs to be stored in computer memory. Early programs had to be painstakingly crafted using the instructions (elementary operations) of the particular machine, often in binary notation. Every model of computer would likely use different instructions (machine language) to do the same task. Later, assembly languages were developed that let the programmer specify each instruction in a text format, entering abbreviations for each operation code instead of a number and specifying addresses in symbolic form (e.g., ADD X, TOTAL). Entering a program in assembly language is usually more convenient, faster, and less prone to human error than using machine language, but because an assembly language is little more than a different notation for a machine language, any two machines with different instruction sets also have different assembly languages.

Some of the earliest computer programmers were women during World War II. According to Dr. Sadie Plant, programming is essentially feminine, not simply because women, from Ada Lovelace to Grace Hopper, were the first programmers, but because of the historical and theoretical ties between programming and what Freud called the quintessentially feminine invention of weaving, between female sexuality as mimicry and the mimicry grounding Turing's vision of computers as universal machines. Women, Plant argues, have not merely had a minor part to play in the emergence of digital machines...Theirs is not a subsidiary role which needs to be rescued for posterity, a small supplement whose inclusion would set the existing records straight...Hardware, software, wetware: before their beginnings and beyond their ends, women have been the simulators, assemblers, and programmers of the digital machines.[10]

In 1954, FORTRAN was invented; it was the first high level programming language to have a functional implementation, as opposed to just a design on paper.[11][12] (A high-level language is, in very general terms, any programming language that allows the programmer to write programs in terms that are more abstract than assembly language instructions, i.e. at a level of abstraction "higher" than that of an assembly language.) It allowed programmers to specify calculations by entering a formula directly (e.g. Y = X*2 + 5*X + 9). The program text, or source, is converted into machine instructions using a special program called a compiler, which translates the FORTRAN program into machine language. In fact, the name FORTRAN stands for "Formula Translation". Many other languages were developed, including some for commercial programming, such as COBOL. Programs were mostly still entered using punched cards or paper tape. (See computer programming in the punch card era). By the late 1960s, data storage devices and computer terminals became inexpensive enough that programs could be created by typing directly into the computers. Text editors were developed that allowed changes and corrections to be made much more easily than with punched cards. (Usually, an error in punching a card meant that the card had to be discarded and a new one punched to replace it.)

Modern programming languages like C++ are vastly more powerful and expressive than their predecessors.

As time has progressed, computers have made giant leaps in the area of processing power. This has brought about newer programming languages that are more abstracted from the underlying hardware. Popular programming languages of the modern era include ActionScript, C++, C#, Haskell, HTML with PHP, Java, JavaScript, Objective-C, Perl, Python, Ruby, Smalltalk, SQL, Visual Basic, and dozens more.[13] Although these high-level languages usually incur greater overhead, the increase in speed of modern computers has made the use of these languages much more practical than in the past. These increasingly abstracted languages typically are easier to learn and allow the programmer to develop applications much more efficiently and with less source code. However, high-level languages are still impractical for a few programs, such as those where low-level hardware control is necessary or where maximum processing speed is vital. Computer programming has become a popular career in the developed world, particularly in the United States, Europe, and Japan. Due to the high labor cost of programmers in these countries, some forms of programming have been increasingly subject to offshore outsourcing (importing software and services from other countries, usually at a lower wage), making programming career decisions in developed countries more complicated, while increasing economic opportunities for programmers in less developed areas, particularly China and India.

Modern programming

Quality requirements

Whatever the approach to software development may be, the final program must satisfy some fundamental properties. The following properties are among the most relevant:
  • Reliability: how often the results of a program are correct. This depends on conceptual correctness of algorithms, and minimization of programming mistakes, such as mistakes in resource management (e.g., buffer overflows and race conditions) and logic errors (such as division by zero or off-by-one errors); a small sketch after this list illustrates guarding against such mistakes.
  • Robustness: how well a program anticipates problems not due to programmer error. This includes situations such as incorrect, inappropriate or corrupt data, unavailability of needed resources such as memory, operating system services and network connections, and user error.
  • Usability: the ergonomics of a program: the ease with which a person can use the program for its intended purpose, or in some cases even unanticipated purposes. Such issues can make or break its success even regardless of other issues. This involves a wide range of textual, graphical and sometimes hardware elements that improve the clarity, intuitiveness, cohesiveness and completeness of a program's user interface.
  • Portability: the range of computer hardware and operating system platforms on which the source code of a program can be compiled/interpreted and run. This depends on differences in the programming facilities provided by the different platforms, including hardware and operating system resources, expected behaviour of the hardware and operating system, and availability of platform specific compilers (and sometimes libraries) for the language of the source code.
  • Maintainability: the ease with which a program can be modified by its present or future developers in order to make improvements or customizations, fix bugs and security holes, or adapt it to new environments. Good practices during initial development make the difference in this regard. This quality may not be directly apparent to the end user but it can significantly affect the fate of a program over the long term.
  • Efficiency/performance: the amount of system resources a program consumes (processor time, memory space, slow devices such as disks, network bandwidth and to some extent even user interaction): the less, the better. This also includes correct disposal of some resources, such as cleaning up temporary files and lack of memory leaks.
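To make the reliability and robustness points above concrete, here is a minimal Python sketch (the function name, return value convention, and sample inputs are invented for illustration, not taken from any particular codebase). It checks its input and avoids a division-by-zero logic error instead of crashing:

    def safe_average(values):
        """Return the average of a list of numbers, or None if it cannot be computed."""
        # Robustness: reject input that is not a list or tuple of numbers.
        if not isinstance(values, (list, tuple)):
            return None
        numbers = [v for v in values if isinstance(v, (int, float))]
        # Reliability: avoid a division-by-zero logic error on empty input.
        if len(numbers) == 0:
            return None
        return sum(numbers) / len(numbers)

    print(safe_average([2, 4, 6]))  # 4.0
    print(safe_average([]))         # None instead of a ZeroDivisionError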

Readability of source code

In computer programming, readability refers to the ease with which a human reader can comprehend the purpose, control flow, and operation of source code. It affects the aspects of quality above, including portability, usability and most importantly maintainability.

Readability is important because programmers spend the majority of their time reading, trying to understand and modifying existing source code, rather than writing new source code. Unreadable code often leads to bugs, inefficiencies, and duplicated code. A study[14] found that a few simple readability transformations made code shorter and drastically reduced the time to understand it.

Following a consistent programming style often helps readability. However, readability is more than just programming style. Many factors, having little or nothing to do with the ability of the computer to efficiently compile and execute the code, contribute to readability.[15] Some of these factors include indentation style, use of whitespace, comments, decomposition, and naming conventions for objects and functions.

Various visual programming languages have also been developed with the intent to resolve readability concerns by adopting non-traditional approaches to code structure and display.
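As a rough before-and-after example (both functions are invented for illustration and compute the same result), the second version below applies the kinds of simple readability transformations mentioned above: descriptive names, a small helper function, a comment, and decomposition.

    # Hard to read: unclear names, no structure, no comments.
    def f(a):
        r = 0
        for x in a:
            if x % 2 == 0:
                r += x * x
        return r

    # Easier to read: descriptive names, a docstring, and a small helper.
    def square(value):
        return value * value

    def sum_of_even_squares(numbers):
        """Return the sum of the squares of the even numbers in the list."""
        return sum(square(n) for n in numbers if n % 2 == 0)

    print(f([1, 2, 3, 4]))                    # 20
    print(sum_of_even_squares([1, 2, 3, 4]))  # 20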

Algorithmic complexity

The academic field and the engineering practice of computer programming are both largely concerned with discovering and implementing the most efficient algorithms for a given class of problem. For this purpose, algorithms are classified into orders using so-called Big O notation, which expresses resource use, such as execution time or memory consumption, in terms of the size of an input. Expert programmers are familiar with a variety of well-established algorithms and their respective complexities and use this knowledge to choose algorithms that are best suited to the circumstances.
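As a simple, generic illustration (the functions below are textbook-style sketches, not drawn from any particular source), a linear search runs in O(n) time because it may inspect every element, while a binary search over a sorted list runs in O(log n) because it halves the search range at every step:

    def linear_search(items, target):
        # O(n): may inspect every element.
        for index, value in enumerate(items):
            if value == target:
                return index
        return -1

    def binary_search(sorted_items, target):
        # O(log n): halves the search range each step, but requires sorted input.
        low, high = 0, len(sorted_items) - 1
        while low <= high:
            mid = (low + high) // 2
            if sorted_items[mid] == target:
                return mid
            elif sorted_items[mid] < target:
                low = mid + 1
            else:
                high = mid - 1
        return -1

    data = [3, 8, 15, 23, 42, 57]
    print(linear_search(data, 23))  # 3
    print(binary_search(data, 23))  # 3

Which algorithm is best suited to the circumstances depends, for example, on whether the data is already sorted and how often it will be searched.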

Methodologies

The first step in most formal software development processes is requirements analysis, followed by testing to determine value modeling, implementation, and failure elimination (debugging). There exist a lot of differing approaches for each of those tasks. One approach popular for requirements analysis is Use Case analysis. Nowadays many programmers use forms of Agile software development where the various stages of formal software development are more integrated together into short cycles that take a few weeks rather than years. There are many approaches to the Software development process.
Popular modeling techniques include Object-Oriented Analysis and Design (OOAD) and Model-Driven Architecture (MDA). The Unified Modeling Language (UML) is a notation used for both the OOAD and MDA.

A similar technique used for database design is Entity-Relationship Modeling (ER Modeling).
Implementation techniques include imperative languages (object-oriented or procedural), functional languages, and logic languages.

Measuring language usage

It is very difficult to determine which modern programming languages are most popular. Some languages are very popular for particular kinds of applications (e.g., COBOL is still strong in the corporate data center[citation needed], often on large mainframes, FORTRAN in engineering applications, scripting languages in Web development, and C in embedded applications), while some languages are regularly used to write many different kinds of applications. Also, many applications use a mix of several languages in their construction and use. New languages are generally designed around the syntax of a previous language with new functionality added (for example C++ adds object-orientation to C, and Java adds memory management and bytecode to C++).

Methods of measuring programming language popularity include: counting the number of job advertisements that mention the language,[16] the number of books sold and courses teaching the language (this overestimates the importance of newer languages), and estimates of the number of existing lines of code written in the language (this underestimates the number of users of business languages such as COBOL).

Debugging

The bug from 1947 which is at the origin of a popular (but incorrect) etymology for the common term for a software defect.

Debugging is a very important task in the software development process, since having defects in a program can have significant consequences for its users. Some languages are more prone to some kinds of faults because their specification does not require compilers to perform as much checking as other languages. Use of a static code analysis tool can help detect some possible problems.

Debugging is often done with IDEs like Eclipse, Kdevelop, NetBeans, Code::Blocks, and Visual Studio. Standalone debuggers like gdb are also used, and these often provide less of a visual environment, usually using a command line.

Programming languages

Different programming languages support different styles of programming (called programming paradigms). The choice of language used is subject to many considerations, such as company policy, suitability to task, availability of third-party packages, or individual preference. Ideally, the programming language best suited for the task at hand will be selected. Trade-offs from this ideal involve finding enough programmers who know the language to build a team, the availability of compilers for that language, and the efficiency with which programs written in a given language execute. Languages form an approximate spectrum from "low-level" to "high-level"; "low-level" languages are typically more machine-oriented and faster to execute, whereas "high-level" languages are more abstract and easier to use but execute less quickly. It is usually easier to code in "high-level" languages than in "low-level" ones.
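For instance (a minimal sketch, not tied to any particular project), the same small task, summing the squares of a list of numbers, can be written in an imperative style or in a more functional style in Python; each style reflects a different programming paradigm:

    numbers = [1, 2, 3, 4, 5]

    # Imperative style: describe step by step how to build the result.
    total = 0
    for n in numbers:
        total += n * n
    print(total)  # 55

    # Functional style: describe what the result is, using an expression.
    print(sum(n * n for n in numbers))  # 55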

Source: Wikipedia