Portal:Computer programming
The Computer Programming Portal
Computer programming or coding is the composition of sequences of instructions, called programs, that computers can follow to perform tasks. It involves designing and implementing algorithms, step-by-step specifications of procedures, by writing code in one or more programming languages. Programmers typically use high-level programming languages that are more easily intelligible to humans than machine code, which is directly executed by the central processing unit. Proficient programming usually requires expertise in several different subjects, including knowledge of the application domain, details of programming languages and generic code libraries, specialized algorithms, and formal logic.
Auxiliary tasks accompanying and related to programming include analyzing requirements, testing, debugging (investigating and fixing problems), implementation of build systems, and management of derived artifacts, such as programs' machine code. While these are sometimes considered programming, often the term software development is used for this larger overall process – with the terms programming, implementation, and coding reserved for the writing and editing of code per se. Sometimes software development is known as software engineering, especially when it employs formal methods or follows an engineering design process. (Full article...)
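To make the relationship between an algorithm and a program concrete, here is a minimal sketch in Python (one high-level language among many) of a classic step-by-step procedure, binary search. The function name and the example data are chosen purely for illustration.

```python
def binary_search(items, target):
    """Return the index of target in the sorted list items, or -1 if absent."""
    # The algorithm is the step-by-step specification: repeatedly halve the
    # search interval until the target is found or the interval is empty.
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1

print(binary_search([2, 3, 5, 7, 11, 13], 7))  # prints 3
```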
Selected articles
-
Image 1
C# (pronounced See sharp) is a general-purpose, high-level programming language supporting multiple paradigms. C# encompasses static typing, strong typing, and lexical scoping, and supports imperative, declarative, functional, generic, object-oriented (class-based), and component-oriented programming disciplines.
The C# programming language was designed by Anders Hejlsberg from Microsoft in 2000 and was later approved as an international standard by Ecma (ECMA-334) in 2002 and ISO/IEC (ISO/IEC 23270) in 2003. Microsoft introduced C# along with .NET Framework and Visual Studio, both of which were closed-source. At the time, Microsoft had no open-source products. Four years later, in 2004, a free and open-source project called Mono began, providing a cross-platform compiler and runtime environment for the C# programming language. A decade later, Microsoft released Visual Studio Code (code editor), Roslyn (compiler), and the unified .NET platform (software framework), all of which support C# and are free, open-source, and cross-platform. Mono also joined Microsoft but was not merged into .NET.
The most recent stable version of the language is C# 11.0, which was released in 2022 with .NET 7.0. (Full article...) -
Image 2
Portion of the calculating machine with a printing mechanism of the analytical engine, built by Charles Babbage, as displayed at the Science Museum (London)
The analytical engine was a proposed mechanical general-purpose computer designed by English mathematician and computer pioneer Charles Babbage. It was first described in 1837 as the successor to Babbage's difference engine, which was a design for a simpler mechanical calculator.
The analytical engine incorporated an arithmetic logic unit, control flow in the form of conditional branching and loops, and integrated memory, making it the first design for a general-purpose computer that could be described in modern terms as Turing-complete. In other words, the structure of the analytical engine was essentially the same as that which has dominated computer design in the electronic era. The analytical engine is one of the most successful achievements of Charles Babbage.
Babbage was never able to complete construction of any of his machines due to conflicts with his chief engineer and inadequate funding. It was not until 1941 that Konrad Zuse built the first general-purpose computer, Z3, more than a century after Babbage had proposed the pioneering analytical engine in 1837. (Full article...) -
Image 3
A Blender screenshot displaying the 3D test model Suzanne
Computer graphics deals with generating images and art with the aid of computers. Today, computer graphics is a core technology in digital photography, film, video games, digital art, cell phone and computer displays, and many specialized applications. A great deal of specialized hardware and software has been developed, with the displays of most devices being driven by computer graphics hardware. It is a vast and recently developed area of computer science. The phrase was coined in 1960 by computer graphics researchers Verne Hudson and William Fetter of Boeing. It is often abbreviated as CG, or typically in the context of film as computer generated imagery (CGI). The non-artistic aspects of computer graphics are the subject of computer science research.
Some topics in computer graphics include user interface design, sprite graphics, rendering, ray tracing, geometry processing, computer animation, vector graphics, 3D modeling, shaders, GPU design, implicit surfaces, visualization, scientific computing, image processing, computational photography, scientific visualization, computational geometry and computer vision, among others. The overall methodology depends heavily on the underlying sciences of geometry, optics, physics, and perception.
Computer graphics is responsible for displaying art and image data effectively and meaningfully to the consumer. It is also used for processing image data received from the physical world, such as photo and video content. Computer graphics development has had a significant impact on many types of media and has revolutionized animation, movies, advertising, and video games, in general. (Full article...) -
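As a small illustration of one topic listed above, ray tracing, the Python sketch below computes where a ray first intersects a sphere, the geometric core of a ray tracer. It is a toy example rather than production rendering code, and every name and value in it is invented for illustration.

```python
import math

def ray_sphere_intersection(origin, direction, center, radius):
    """Return the distance along the ray to the nearest hit point, or None.

    A ray is origin + t * direction (direction is assumed to be unit length);
    intersecting it with a sphere reduces to solving a quadratic in t.
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c          # the quadratic's leading coefficient is 1
    if disc < 0:
        return None                 # the ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

# A ray from the origin along the z-axis hits a sphere centred at (0, 0, 5): t == 4.0
print(ray_sphere_intersection((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))
```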
Image 4
The history of artificial intelligence (AI) began in antiquity, with myths, stories and rumors of artificial beings endowed with intelligence or consciousness by master craftsmen. The seeds of modern AI were planted by philosophers who attempted to describe the process of human thinking as the mechanical manipulation of symbols. This work culminated in the invention of the programmable digital computer in the 1940s, a machine based on the abstract essence of mathematical reasoning. This device and the ideas behind it inspired a handful of scientists to begin seriously discussing the possibility of building an electronic brain.
The field of AI research was founded at a workshop held on the campus of Dartmouth College, USA, during the summer of 1956. Those who attended would become the leaders of AI research for decades. Many of them predicted that a machine as intelligent as a human being would exist in no more than a generation, and they were given millions of dollars to make this vision come true.
Eventually, it became obvious that commercial developers and researchers had grossly underestimated the difficulty of the project. In 1974, in response to the criticism from James Lighthill and ongoing pressure from Congress, the U.S. and British Governments stopped funding undirected research into artificial intelligence, and the difficult years that followed would later be known as an "AI winter". Seven years later, a visionary initiative by the Japanese Government inspired governments and industry to provide AI with billions of dollars, but by the late 1980s the investors became disillusioned and withdrew funding again. (Full article...) -
Image 5
Structured Query Language (SQL) (/ˌɛsˌkjuːˈɛl/ S-Q-L, sometimes /ˈsiːkwəl/ "sequel" for historical reasons) is a domain-specific language used in programming and designed for managing data held in a relational database management system (RDBMS), or for stream processing in a relational data stream management system (RDSMS). It is particularly useful in handling structured data, i.e., data incorporating relations among entities and variables.
Introduced in the 1970s, SQL offered two main advantages over older read–write APIs such as ISAM or VSAM. Firstly, it introduced the concept of accessing many records with a single command. Secondly, it eliminated the need to specify how to reach a record, i.e., with or without an index.
Originally based upon relational algebra and tuple relational calculus, SQL consists of many types of statements, which may be informally classed as sublanguages, commonly: a data query language (DQL), a data definition language (DDL), a data control language (DCL), and a data manipulation language (DML). The scope of SQL includes data query, data manipulation (insert, update, and delete), data definition (schema creation and modification), and data access control. Although SQL is essentially a declarative language (4GL), it also includes procedural elements. (Full article...) -
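The sublanguages described above can be seen in a short, self-contained sketch. The example below uses Python's built-in sqlite3 module with an in-memory database; the table and rows are invented for illustration, and SQLite does not implement the data control sublanguage (GRANT/REVOKE), so only DDL, DML, and DQL appear.

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # throwaway in-memory database
cur = conn.cursor()

# DDL (data definition): create the schema.
cur.execute("CREATE TABLE books (id INTEGER PRIMARY KEY, title TEXT, year INTEGER)")

# DML (data manipulation): insert and update rows.
cur.execute("INSERT INTO books (title, year) VALUES (?, ?)",
            ("The C Programming Language", 1978))
cur.execute("UPDATE books SET year = 1988 WHERE title = ?",
            ("The C Programming Language",))

# DQL (data query): a declarative SELECT; it states what is wanted,
# not how the database engine should reach the records.
cur.execute("SELECT title, year FROM books WHERE year > 1980")
print(cur.fetchall())  # [('The C Programming Language', 1988)]

conn.close()
```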
Image 6
Laboratory Virtual Instrument Engineering Workbench (LabVIEW) is a system-design platform and development environment for a visual programming language from National Instruments.
The graphical language is named "G" (not to be confused with G-code). The G dataflow language was originally developed for LabVIEW. LabVIEW is commonly used for data acquisition, instrument control, and industrial automation on a variety of operating systems (OSs), including macOS and other versions of Unix and Linux, as well as Microsoft Windows.
The latest versions of LabVIEW are LabVIEW 2023 Q1 (released in April 2023) and LabVIEW NXG 5.1 (released in January 2021). NI released the LabVIEW and LabVIEW NXG Community editions, which are free for non-commercial use, on April 28, 2020. (Full article...) -
Image 7
In computer programming, assembly language (alternatively assembler language or symbolic machine code), often referred to simply as assembly and commonly abbreviated as ASM or asm, is any low-level programming language with a very strong correspondence between the instructions in the language and the architecture's machine code instructions. Assembly language usually has one statement per machine instruction (1:1), but constants, comments, assembler directives, symbolic labels (of, e.g., memory locations and registers), and macros are generally also supported.
The first assembly code in which a language is used to represent machine code instructions is found in Kathleen and Andrew Donald Booth's 1947 work, Coding for A.R.C. Assembly code is converted into executable machine code by a utility program referred to as an assembler. The term "assembler" is generally attributed to Wilkes, Wheeler and Gill in their 1951 book The Preparation of Programs for an Electronic Digital Computer, who, however, used the term to mean "a program that assembles another program consisting of several sections into a single program". The conversion process is referred to as assembly, as in assembling the source code. The computational step when an assembler is processing a program is called assembly time.
Because assembly depends on the machine code instructions, each assembly language is specific to a particular computer architecture. (Full article...) -
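To give a rough, hedged sense of what an assembler does, the Python sketch below translates mnemonics of an invented toy instruction set into numeric opcodes, preserving the one-statement-per-instruction correspondence described above. Real assemblers also handle directives, symbolic labels, macros, and relocation, and none of the opcodes here belong to any real architecture.

```python
# A toy assembler for an invented instruction set (purely illustrative; these
# mnemonics and opcodes do not correspond to any real architecture).
OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03, "HALT": 0xFF}

def assemble(source):
    """Translate symbolic statements into machine code, one instruction per line."""
    machine_code = []
    for line in source.strip().splitlines():
        line = line.split(";")[0].strip()       # drop comments and whitespace
        if not line:
            continue
        mnemonic, *operands = line.split()
        machine_code.append(OPCODES[mnemonic])  # mnemonic -> opcode
        machine_code.extend(int(op) for op in operands)
    return bytes(machine_code)

program = """
LOAD 10     ; load the value at address 10
ADD 11      ; add the value at address 11
STORE 12    ; store the result at address 12
HALT
"""
print(assemble(program).hex())  # 010a020b030cff
```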
Image 8
Computer class at Chkalovski Village School No. 2 in 1985–1986
The history of computing in the Soviet Union began in the late 1940s, when the country began to develop its Small Electronic Calculating Machine (MESM) at the Kiev Institute of Electrotechnology in Feofaniya. Initial ideological opposition to cybernetics in the Soviet Union was overcome by a Khrushchev era policy that encouraged computer production.
By the early 1970s, the uncoordinated work of competing government ministries had left the Soviet computer industry in disarray. Due to a lack of common standards for peripherals and a lack of digital storage capacity, the Soviet Union's technology significantly lagged behind the West's semiconductor industry. The Soviet government decided to abandon development of original computer designs and encouraged cloning of existing Western systems (e.g. the 1801 CPU series was scrapped in favor of the PDP-11 ISA by the early 1980s).
Soviet industry was unable to mass-produce computers to acceptable quality standards and locally manufactured copies of Western hardware were unreliable. As personal computers spread to industries and offices in the West, the Soviet Union's technological lag increased. (Full article...) -
Image 9
Eiffel is an object-oriented programming language designed by Bertrand Meyer (an object-orientation proponent and author of Object-Oriented Software Construction) and Eiffel Software. Meyer conceived the language in 1985 with the goal of increasing the reliability of commercial software development, with the first version becoming available in 1986. In 2005, Eiffel became an ISO-standardized language.
The design of the language is closely connected with the Eiffel programming method. Both are based on a set of principles, including design by contract, command–query separation, the uniform-access principle, the single-choice principle, the open–closed principle, and option–operand separation.
Many concepts initially introduced by Eiffel later found their way into Java, C#, and other languages. New language design ideas, particularly through the Ecma/ISO standardization process, continue to be incorporated into the Eiffel language. (Full article...) -
Image 10
MATLAB (an abbreviation of "MATrix LABoratory") is a proprietary multi-paradigm programming language and numeric computing environment developed by MathWorks. MATLAB allows matrix manipulations, plotting of functions and data, implementation of algorithms, creation of user interfaces, and interfacing with programs written in other languages.
Although MATLAB is intended primarily for numeric computing, an optional toolbox uses the MuPAD symbolic engine allowing access to symbolic computing abilities. An additional package, Simulink, adds graphical multi-domain simulation and model-based design for dynamic and embedded systems.
MATLAB has more than 4 million users worldwide, coming from various backgrounds in engineering, science, and economics. More than 5,000 global colleges and universities use MATLAB to support instruction and research. (Full article...) -
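MATLAB source code is not reproduced here; as a loose analogy only, the Python/NumPy sketch below performs the kind of matrix manipulation that MATLAB is built around, solving a small linear system. The syntax and library calls are Python's rather than MATLAB's, and the numbers are arbitrary.

```python
import numpy as np

# Matrix manipulation of the kind MATLAB is designed for, expressed here in
# Python/NumPy purely as an analogy (this is not MATLAB syntax).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

x = np.linalg.solve(A, b)   # solve the linear system A @ x = b
print(x)                    # [0.8 1.4]
print(A @ x)                # reproduces b: [3. 5.]
```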
Image 11
Cover graphic of The C Programming Language book.
C (pronounced /ˈsiː/ – like the letter c) is a general-purpose computer programming language. It was created in the 1970s by Dennis Ritchie, and remains very widely used and influential. By design, C's features cleanly reflect the capabilities of the targeted CPUs. It has found lasting use in operating systems, device drivers, and protocol stacks, but its use in application software has been decreasing. C is commonly used on computer architectures that range from the largest supercomputers to the smallest microcontrollers and embedded systems.
A successor to the programming language B, C was originally developed at Bell Labs by Ritchie between 1972 and 1973 to construct utilities running on Unix. It was applied to re-implementing the kernel of the Unix operating system. During the 1980s, C gradually gained popularity. It has become one of the most widely used programming languages, with C compilers available for practically all modern computer architectures and operating systems. The book The C Programming Language, co-authored by the original language designer, served for many years as the de facto standard for the language. C has been standardized since 1989 by the American National Standards Institute (ANSI) and the International Organization for Standardization (ISO).
C is an imperative procedural language, supporting structured programming, lexical variable scope, and recursion, with a static type system. It was designed to be compiled to provide low-level access to memory and language constructs that map efficiently to machine instructions, all with minimal runtime support. Despite its low-level capabilities, the language was designed to encourage cross-platform programming. A standards-compliant C program written with portability in mind can be compiled for a wide variety of computer platforms and operating systems with few changes to its source code. (Full article...) -
Image 12
Allen at the Flying Heritage Collection in 2013
Paul Gardner Allen (January 21, 1953 – October 15, 2018) was an American business magnate, computer programmer, researcher, investor, film producer, and philanthropist. He is best known for co-founding Microsoft Corporation with his childhood friend Bill Gates in 1975, which helped spark the microcomputer revolution of the 1970s and 1980s. Microsoft went on to become the world's largest personal computer software company. Allen was ranked as the 44th-wealthiest person in the world by Forbes in 2018, with an estimated net worth of $20.3 billion at the time of his death.
Allen quit day-to-day work at Microsoft in early 1983 after a Hodgkin lymphoma diagnosis, remaining on its board as vice-chairman. He and his sister, Jody Allen, founded Vulcan Inc. in 1986, a privately held company that managed his business and philanthropic efforts. He had a multi-billion dollar investment portfolio, including technology and media companies, scientific research, real estate holdings, private space flight ventures, and stakes in other sectors. He owned the Seattle Seahawks of the National Football League and the Portland Trail Blazers of the National Basketball Association, and was part-owner of the Seattle Sounders FC of Major League Soccer. In 2000, he resigned from his position on Microsoft's board and assumed the post of senior strategy advisor to the company's management team.
Allen founded the Allen Institutes for Brain Science, Artificial Intelligence, and Cell Science, as well as companies like Stratolaunch Systems and Apex Learning. He gave more than $2 billion to causes such as education, wildlife and environmental conservation, the arts, healthcare, and community services. In 2004, he funded the first crewed private spaceplane with SpaceShipOne. He received numerous awards and honors, and was listed among the Time 100 Most Influential People in the World in 2007 and 2008. (Full article...) -
Image 13
The Antikythera mechanism (/ˌæntɪˈkɪθɪərə/ AN-tih-KIH-ther-ə) is an Ancient Greek hand-powered orrery (model of the Solar System), described as the oldest known example of an analogue computer used to predict astronomical positions and eclipses decades in advance. It could also be used to track the four-year cycle of athletic games similar to an Olympiad, the cycle of the ancient Olympic Games.
This artefact was among wreckage retrieved from a shipwreck off the coast of the Greek island Antikythera in 1901. In 1902, it was identified by archaeologist Valerios Stais as containing a gear. The device, housed in the remains of a wooden-framed case of (uncertain) overall size 34 cm × 18 cm × 9 cm (13.4 in × 7.1 in × 3.5 in), was found as one lump, later separated into three main fragments which are now divided into 82 separate fragments after conservation efforts. Four of these fragments contain gears, while inscriptions are found on many others. The largest gear is about 13 cm (5 in) in diameter and originally had 223 teeth.
In 2008, a team from Cardiff University used computer X-ray tomography and high-resolution scanning to image inside fragments of the crust-encased mechanism and read the faintest inscriptions that once covered the outer casing. This suggests it had 37 meshing bronze gears enabling it to follow the movements of the Moon and the Sun through the zodiac, to predict eclipses and to model the irregular orbit of the Moon, where the Moon's velocity is higher at its perigee than at its apogee. This motion was studied in the 2nd century BC by astronomer Hipparchus of Rhodes, and he may have been consulted in the machine's construction. There is speculation that a portion of the mechanism is missing and that it calculated the positions of the five classical planets. The inscriptions were further deciphered in 2016, revealing numbers connected with the synodic cycles of Venus and Saturn. (Full article...) -
Image 14
Stephen Gary Wozniak (/ˈwɒzniæk/; born August 11, 1950), also known by his nickname "Woz", is an American technology entrepreneur, electronics engineer, computer scientist, computer programmer, philanthropist, and inventor. In 1976, he co-founded Apple Computer with his late business partner Steve Jobs, which later became the world's largest technology company by revenue and the largest company in the world by market capitalization. Through his work at Apple in the 1970s and 1980s, he is widely recognized as one of the most prominent pioneers of the personal computer revolution.
In 1975, Wozniak started developing the Apple I into the computer that launched Apple when he and Jobs first began marketing it the following year. He primarily designed the Apple II, introduced in 1977, known as one of the first highly successful mass-produced microcomputers, while Jobs oversaw the development of its foam-molded plastic case and early Apple employee Rod Holt developed its switching power supply. With human–computer interface expert Jef Raskin, Wozniak had a major influence over the initial development of the original Apple Macintosh concepts from 1979 to 1981, when Jobs took over the project following Wozniak's brief departure from the company due to a traumatic airplane accident. After permanently leaving Apple in 1985, Wozniak founded CL 9 and created the first programmable universal remote, released in 1987. He then pursued several other businesses and philanthropic ventures throughout his career, focusing largely on technology in K–12 schools.
As of February 2020, Wozniak has remained an employee of Apple in a ceremonial capacity since stepping down in 1985. In recent years, he has helped fund multiple entrepreneurial efforts dealing in areas such as GPS and telecommunications, flash memory, technology and pop culture conventions, technical education, ecology, satellites and more. (Full article...) -
Image 15
Computer programming or coding is the composition of sequences of instructions, called programs, that computers can follow to perform tasks. It involves designing and implementing algorithms, step-by-step specifications of procedures, by writing code in one or more programming languages. Programmers typically use high-level programming languages that are more easily intelligible to humans than machine code, which is directly executed by the central processing unit. Proficient programming usually requires expertise in several different subjects, including knowledge of the application domain, details of programming languages and generic code libraries, specialized algorithms, and formal logic.
Auxiliary tasks accompanying and related to programming include analyzing requirements, testing, debugging (investigating and fixing problems), implementation of build systems, and management of derived artifacts, such as programs' machine code. While these are sometimes considered programming, often the term software development is used for this larger overall process – with the terms programming, implementation, and coding reserved for the writing and editing of code per se. Sometimes software development is known as software engineering, especially when it employs formal methods or follows an engineering design process. (Full article...)
Selected images
-
Image 1
A lone house. An image made using Blender 3D.
-
Image 2
Partial map of the Internet based on the January 15, 2005 data found on opte.org. Each line is drawn between two nodes, representing two IP addresses. The length of the lines is indicative of the delay between those two nodes. This graph represents less than 30% of the Class C networks reachable by the data collection program in early 2005.
-
Image 3
Partial view of the Mandelbrot set. Step 1 of a zoom sequence: the gap between the "head" and the "body", also called the "seahorse valley".
-
Image 4
Animation of a non-uniform rational B-spline surface. Modeled and rendered in Cobalt.
-
Image 5
A view of the GNU nano text editor, version 6.0
-
Image 6
GNOME Shell, GNOME Clocks, Evince, gThumb and GNOME Files at version 3.30, in a dark theme
-
Image 9
A head crash on a modern hard disk drive
-
Image 10
Stephen Wolfram is a British-American computer scientist, physicist, and businessman. He is known for his work in computer science, mathematics, and theoretical physics.
-
Image 11
Ada Lovelace was an English mathematician and writer, chiefly known for her work on Charles Babbage's proposed mechanical general-purpose computer, the Analytical Engine. She was the first to recognize that the machine had applications beyond pure calculation, and to have published the first algorithm intended to be carried out by such a machine. As a result, she is often regarded as the first computer programmer.
-
Image 12
Margaret Hamilton standing next to the navigation software that she and her MIT team produced for the Apollo Project.
-
Image 13
Output from a (linearised) shallow water equation model of water in a bathtub. The water experiences five splashes, which generate surface gravity waves that propagate away from the splash locations and reflect off the bathtub walls.
-
Image 14
Bill Gosper's Glider Gun in action
-
Image 15
Grace Hopper at the UNIVAC keyboard, c. 1960. Grace Brewster Murray Hopper was an American mathematician and rear admiral in the U.S. Navy who was a pioneer in developing computer technology, helping to devise UNIVAC I, the first commercial electronic computer, and naval applications for COBOL (common business-oriented language).
-
Image 16
This image (when viewed in full size, 1000 pixels wide) contains 1 million pixels, each of a different color.
-
Image 17
Deep Blue was a chess-playing expert system run on a unique purpose-built IBM supercomputer. It was the first computer to win a game, and the first to win a match, against a reigning world champion under regular time controls. Photo taken at the Computer History Museum.
-
Image 18
An IBM Port-A-Punch punched card
Did you know?

- ... that computer science professor Ruth Aylett performed with a robot poet in the Edinburgh Free Fringe?
- ... that Phil Fletcher as Hacker T. Dog caused Lauren Layfield to make the "most famous snort" in the United Kingdom in 2016?
- ... that the study of selection algorithms has been traced to an 1883 work of Lewis Carroll on how to award second place in single-elimination tournaments?
- ... that Guy Parmelin, now President of Switzerland, opened the study program of cyber security of the Lucerne School of Information Technology in 2018?
- ... that Rust has been named the "most loved programming language" every year for seven years since 2016 by annual surveys conducted by Stack Overflow?
- ... that Brazilian computer science researcher and internet pioneer Tadao Takahashi negotiated with drug lords to install internet equipment in his country?
Subcategories
WikiProjects
- There are many users interested in computer programming; join them.
Computer programming news
No recent news
Topics
- Fields
- Concepts
- Orientations
- Models
- Related fields
Related portals
Associated Wikimedia
The following Wikimedia Foundation sister projects provide more on this subject:
-
Commons
Free media repository -
Wikibooks
Free textbooks and manuals -
Wikidata
Free knowledge base -
Wikinews
Free-content news -
Wikiquote
Collection of quotations -
Wikisource
Free-content library -
Wikiversity
Free learning tools -
Wiktionary
Dictionary and thesaurus
-
Random portal