Back Office

This glossary compiles all the definitions included in all issues of Back Office.

A

Algorithm
API
Application
Artificial intelligence
Assembler

B

Back End
Big Data
Bot

C

Cloud
CMS
Compiler
Compositing
Computability
Computer Science
CSS
CSS Print
Cyberculture
Cybernetics

D

D3.js
Data
Data Set
Database
Deep Learning
Desktop Metaphor
Desktop Publishing
Digital
DOM

E

Encoding
EPS

F

Feedback
Flash
Fork
Free License
Free Software
Function

G

G-Code
GitHub
GNU
GPL
Graffiti
Graphical User Interface (GUI)

H

Hacker
HP-GL
HTML

I

Information
Inkscape
Interface
Internet

J

Java
JavaScript

L

Library (Software)
Linux

M

Machine Learning
Mainframe
Markdown
Markup Language
Media
Metadata
Metamedium
MIT Media Lab
Multimedia

N

Nerd

O

Object-Oriented Programming
OCR
Open data
Open source
OpenType
Operating System

P

PC
PHP
Pipe
Plain Text
Plug-in
Processing
Program
Programming
Programming Language
Python

R

Random
Regular Expression
Rich Text

S

Scribus
Script
Shell
Shortcode
Software
Source Code
Standard
Streaming
SVG

T

TED
Teletype
Tracker
Turing Machine

U

Unix
URL
UX

V

Vector
Vox-ATypI

W

Web
Web Browser
Web Server
Web Tracking
Webapp
Webfont
WebGL
Wiki
WIMP
WYSIWYG

X

Xerox PARC
XML

  • Algorithm

    A linear sequence of homogeneous instructions that performs operations transforming input data into an expected output. Algorithms must be translated into a programming language to be understood by machines.
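
    By way of illustration (not drawn from the journal), the Python sketch below shows Euclid’s algorithm: a finite sequence of instructions that transforms its input (two integers) into the expected output (their greatest common divisor).

        def gcd(a, b):
            """Euclid's algorithm: repeat until the remainder is zero."""
            while b != 0:
                a, b = b, a % b   # replace (a, b) with (b, a mod b)
            return a

        print(gcd(1071, 462))   # 21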

  • API

    Application Programming Interface: APIs are intended to facilitate a programmer’s work by providing a ready-made set of functions, protocols and tools for building software (applications, websites, etc.).
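
    As a minimal sketch (not from the source), the snippet below uses the API of Python’s standard json module: a documented set of functions (dumps, loads) that a program can call without knowing how the library works internally.

        import json

        record = {"term": "API", "example": True}   # made-up data for illustration
        text = json.dumps(record)                   # serialize through the published interface
        print(json.loads(text)["term"])             # API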

  • Application

    An application (“app”) or application software is a type of software that enables its users to perform specific tasks (text writing, music composition, Web browsing, etc.).

  • Artificial intelligence

    A set of theories and techniques used to produce machines capable of simulating or realizing capacities requiring human intelligence (perception, memory, emotion, critical thinking, etc.). Alan Turing’s 1950 article “Computing Machinery and Intelligence” laid the groundwork for the field; the term itself was coined in 1956 at the Dartmouth workshop.

  • Assembler

    In computer science, assembly language is a low-level programming language whose instructions remain readable by a human being (unlike binary code, for instance); an assembler is the program that translates it into machine code. This type of language is very close to the machine’s hardware architecture and allows direct interaction with computer resources. Assembly languages are often used for developing programs that require high computing performance.

  • Back End

    A back end is a technical infrastructure, usually secured with a password, used to administer a client interface (front end).

  • Big Data

    Big data refers to data sets that have become so voluminous that they exceed the storage, analysis or visualization capabilities of conventional computing tools. The term appeared in the mid-2000s with the proliferation of physical sensors and website tracking tools.

  • Bot

    This word, derived from “robot,” most often refers to a stand-alone program using algorithms designed to mimic the behavior of a human user (e.g. chatbots, the interactive agents used for online chats).

  • Cloud

    Cloud computing refers to the use of computing resources on remote machines (servers) that are, most of the time, rented or billed on a pay-per-use basis.

  • CMS

    A CMS (Content Management System) is a program designed to enter and read data for a website. It is often deployed on a host server and allows administrators to enter content through a private interface (back end). Most of the time, data is stored in a database and “served” to the client’s browser.

  • Compiler

    A compiler is a program that turns source code written in a programming language, readable by a human being, into binary code that can be executed (read) by a machine.

  • Compositing

    Compositing is a set of digital techniques, generally used for film post-production, which allows several images or media to be mixed into one shot or one item.

  • Computability

    The computability of a mathematical function defines its ability to be calculated by a Turing machine. A computable function can thus be translated into a computer program, executable by any compatible system.

  • Computer Science

    Computer science is the theoretical and practical study of the design and use of computers. It is the scientific approach to computation and the systematic study of the procedures that underlie the acquisition, representation, processing, storage, and communication of data.

  • CSS

    Cascading Style Sheets is a style sheet language used to describe the layout of a Web document (usually a webpage).

  • CSS Print

    CSS Print is a set of instructions written in the CSS language used to manage the printed rendering of a webpage, making it possible to use Web technologies as a multi-support publishing and layout environment.

  • Cyberculture

    The word cyberculture appeared in the 1990s. It’s an effort to merge the concepts of cybernetics (scientific study of control), cyberpunk (dystopian sci-fi) and cyberspace (design, browsing and relationship methods allowed by digital technologies). Cyberculture implies both a set of cultural productions and a new approach to culture.

  • Cybernetics

    Cybernetics (from the Greek kubernêtês: pilot, governor) is a term proposed in 1947 by mathematician Norbert Wiener to define a science concerning the control of natural or artificial systems, based on computers. It incorporates notions such as balance (and entropy), systems (piloted), black box, feedback and information (signal theory, input/output relationships).

  • D3.js

    D3.js (or D3 for Data-Driven Documents) is a JavaScript library for the visual representation of data. Developed in 2010 by Mike Bostock, it is now the most common data visualization tool on the Web.

  • Data

    Data is an elementary description of a reality that can be evaluated according to a reference system. This may include, for example, the result of an observation or a measurement. Raw data generally needs to be analyzed and organized in order to derive meaning and therefore information.

  • Data Set

    A set of digital data associated with a particular observation, the values of which are expressed in a coherent system.

  • Database

    In computer science, a database is a program for storing and accessing data, usually through a dedicated language such as SQL. There are several types of database structures: navigational (data is organized according to a network of fixed links that can be read one by one), hierarchical (data is organized according to a pyramid scheme and can be read from its “master” occurrence), relational (data is organized as tables with predefined columns whose values can be used to sort and filter the results of a query) and “NoSQL” (data is stored without any predefined structure and is indexed with a unique identifier).
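
    A minimal sketch of a relational database, using Python’s built-in sqlite3 module (the table and its contents are invented for illustration):

        import sqlite3

        db = sqlite3.connect(":memory:")   # a temporary, in-memory database
        db.execute("CREATE TABLE entries (id INTEGER, term TEXT)")
        db.execute("INSERT INTO entries VALUES (?, ?)", (1, "Database"))
        for row in db.execute("SELECT term FROM entries WHERE id = 1"):
            print(row[0])                  # Database
        db.close()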

  • Deep Learning

    Deep learning refers to machine learning techniques based on previously defined large data sets that are designed to “train” algorithms, i.e. to enable them to define their own analysis parameters. Since the 2000s, the development of these methods has led to significant advances in the field of signal processing, including facial recognition, speech recognition, computer vision and automatic language processing.

  • Desktop Metaphor

    Conceptualized with the release of the Xerox Star computer in 1981, the desktop metaphor is a set of items in a virtual graphic interface which depict, through icons, the typical objects of a desktop (trash, folders, files, etc.). These icons can represent actions executed by the OS (delete, copy, paste, etc.) as well as stored data (files).

  • Desktop Publishing

    Desktop publishing (DTP), which appeared in the 1980s, refers to the activity of designing printed documents on a personal computer with dedicated software (page layout, typesetting, photo editing, etc.).

  • Digital

    In the field of computer science, the word digital refers to a virtual object (number, text, image, data, program, etc.) encoded as a binary number, namely a series of bits in 0 or 1 state. More generally, this term is used to define the set of contents and activities related to computer use.

  • DOM

    A DOM or Document Object Model is the computed structure of a webpage. Web browsers generally include a DOM inspector, a development tool used to observe and edit the DOM while a page is being viewed.

  • Encoding

    Encoding is the process of transcribing a text message or data according to specific conventions. Unicode (begun in 1987) is, for example, a standardized character encoding system that assigns a unique numerical identifier to each character to ensure that it is displayed correctly regardless of the computer platform or software used.
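
    A minimal sketch in Python showing the same character as a Unicode code point and as UTF-8 bytes:

        char = "é"
        print(ord(char))                     # 233, the character's unique Unicode identifier
        print(char.encode("utf-8"))          # b'\xc3\xa9', its two-byte UTF-8 form
        print(b"\xc3\xa9".decode("utf-8"))   # back to 'é'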

  • EPS

    The EPS (Encapsulated PostScript) file format encapsulates PostScript code describing a printed document. It is suitable for vector images and is notably used by Illustrator.

  • Feedback

    The action exerted in return by an effect on the device that gave rise to it. In the case of human-machine interfaces, interactions are governed by feedback allowing, for example, a user to obtain confirmation that an input has been registered by the machine.

  • Flash

    The proprietary Flash technology enables the enhancement of webpages with interactive or animated items, created in the eponymous software. It was released in 1996, at a time when standard Web languages only allowed basic interactions. Displaying Flash objects (SWF) in a Web browser requires the installation of the Flash Player plug-in. This technology has been criticized for causing security and performance issues; it is therefore not supported by the majority of mobile devices and is progressively disappearing from desktop Web browsers.

  • Fork

    In computer science, a fork is new software derived from the source code of existing software, often released under a free license.

  • Free License

    A free license is an agreement that grants four freedoms to the user: to use the work freely; to study how it works; to modify it; and to redistribute it to other users, including for commercial purposes. The most famous free license is the GPL.

  • Free Software

    Free software allows, in both technical and legal terms, its user to run, study, edit and redistribute it, which implies an open source code. Unlike proprietary software, whose source code is locked, free software promotes the values of sharing and freedom.

  • Function

    In computer science, a function (or procedure) is a piece of a program designed to carry out a specific operation, which can be called whenever it is needed (a routine).
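
    A minimal illustrative sketch in Python: the routine is defined once and called wherever it is needed.

        def area(width, height):
            """Return the area of a rectangle."""
            return width * height

        print(area(3, 4))      # 12
        print(area(21, 29.7))  # an A4 page, in square centimetres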

  • G-Code

    G-Code is a programming language from the 1960s that is used to define sequences of geometric or auxiliary instructions for driving CNC (Computer Numerical Control) machines.

  • GitHub

    Launched in 2008, GitHub is a source code hosting and management platform that is based on the Git version control protocol and on open source collaborative working methods.

  • GNU

    Initiated by Richard Stallman in 1983, GNU (“GNU’s Not Unix”) is a free operating system that takes up the concepts of Unix, a proprietary OS originally developed in 1969 at Bell Labs for mainframe systems, while remaining compatible with it. The merging of GNU with the Linux kernel created by Linus Torvalds in 1991 gave birth to GNU/Linux, the most widely used free OS.

  • GPL

    Finalized in 1989 by Richard Stallman, the GPL (GNU General Public License, GNU-GPL) license’s fundamental goal is to establish the legal terms of distribution of GNU and free software. The terms and conditions of GPL allow anyone to edit, study and redistribute a GPL-licensed project, including under a derived form.

  • Graffiti

    Graffiti is handwriting recognition software developed by the Palm company at the beginning of the 1990s and used in personal digital assistants (PDAs).

  • Graphical User Interface (GUI)

    Invented at the end of the 1960s by the team of American computer scientist Douglas Engelbart at the Stanford Research Institute, and developed at Xerox PARC (Palo Alto Research Center) in the mid-1970s, a graphical user interface (GUI) is a type of interface that allows a user to interact with digital objects through icons, menus and texts represented on a screen (e.g. WIMP). Graphical user interfaces were conceived with the ambition of democratizing access to computers by removing the complexity of the command-line interfaces that were prevalent at the time.

  • Hacker

    A hacker is someone who is able to understand and modify locked computing systems. By extension, and contrary to the derogatory image of the computer pirate, it is possible to see in the hacker’s skills and curiosity a contribution to the common good and to individual emancipation.

  • HP-GL

    HP-GL (Hewlett-Packard Graphics Language) is a programming language developed at the end of the 1970s to drive pen plotters. It is based on a series of instructions representing machine commands (raise or lower the pen, for instance) and geometric coordinates.

  • HTML

    HTML (HyperText Markup Language) is a data description language which is structured with markup and designed to describe the content of webpages. It is one of the three inventions, along with HTTP (HyperText Transfer Protocol) and URL (Uniform Resource Locator) that are the foundations of the Web.

  • Information

    Element of knowledge, which may be formalized through norms to be retained, processed or communicated. In an etymological sense, information is what shapes the mind. It comes from the Latin verb informare, which means “shape, fashion, describe.” 

  • Inkscape

    Inkscape is a vector drawing software released under free license (GPL License) whose first version dates back to 2003. It offers a credible alternative to Illustrator, a proprietary software from Adobe.

  • Interface

    An interface acts as a link between two objects, allowing them to interact according to a set of defined rules. In the field of digital technology, this word can refer to user interfaces that allow users to interact with computers (by using visual representations of virtual objects, for instance; see graphical user interface), to communication protocols between software and hardware devices (drivers), or to protocols between programs (see API).

  • Internet

    The Internet is a global computer network which is distributed, namely without a central hub. Data is transferred from machine to machine using a series of standard protocols which provide the platform for several services, such as e-mail, peer-to-peer file transfer (BitTorrent) or the Web. It superseded the ARPANET network, created in 1969, which was mainly used by universities and governmental agencies.

  • Java

    Object-oriented programming language created in 1995 by Sun Microsystems for software development. A Java program can easily be transferred from one operating system to another without platform-specific compilation, thanks to a virtual machine that executes its code.

  • JavaScript

    Object-oriented programming language created in 1995 by Brendan Eich. Mainly used to add interactivity to webpages, today it has broader applications, in particular with Node.js, a runtime environment that executes JavaScript outside the browser.

  • Library (Software)

    Software libraries are pre-established sets of functions designed to extend the possibilities of a programming language.

  • Linux

    Linux (GNU/Linux) is an operating system, like Windows and macOS. It is released under a free license (GPL) and is the result of the merging of the GNU OS (developed by Richard Stallman from 1983) with the Linux kernel (created by Linus Torvalds in 1991). GNU/Linux is the most emblematic software of hacker culture.

  • Machine Learning

    Machine learning refers to a set of mechanisms designed to adjust the procedures of a program based on the identification of statistical correlations drawn from previous analyses. This type of method can effectively solve problems that are too complex for conventional algorithms. Such methods are used in particular for pattern recognition, search engine development or financial analysis.
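
    As an illustrative sketch only (the data and parameter are invented), the Python snippet below “learns” the slope a of a line y ≈ a·x by repeatedly adjusting it to reduce the error on a few observations:

        data = [(1, 2.1), (2, 3.9), (3, 6.2)]   # (x, y) observations
        a = 0.0                                  # parameter to be learned
        for step in range(1000):
            gradient = sum(2 * (a * x - y) * x for x, y in data) / len(data)
            a -= 0.01 * gradient                 # adjust a against the error gradient
        print(round(a, 2))                       # close to 2: the learned slope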

  • Mainframe

    A mainframe computer is a high-power computer dedicated to centralized data processing. Users of mainframes interact with the computer through terminals, as opposed to a network of connected personal computers. Each terminal acts as an interface for sending commands and displaying results. This type of architecture, mainly used in the 1970s, is still running in some large companies (banks) and administrative entities.

  • Markdown

    Markdown is a markup language created in 2004 by American developer John Gruber. Its purpose is to provide a syntax that is easy to read and write to facilitate text formatting.

  • Markup Language

    Markup languages are computer languages used for structuring, describing and enriching textual data. They are characterized by the inclusion of tags (sequences of predetermined glyphs) in the text flow in accordance with a specific syntax.

  • Media

    The word media refers to a means of transmission that enables communication, whether directly (language, writing, etc.) or through technical processes (radio, TV, the Web, etc., this is referred to as “mass media”). According to the media theoretician Friedrich Kittler, “all technical media either stores, transmits, or processes signals and […] the computer (in theory since 1936, in practice since the Second World War) is the only medium that combines these three functions—storage, transmission, and processing—[all] fully automated.”

  • Metadata

    Metadata is data used to define or describe other data. The shooting date of a digital photograph is, for example, an element of the metadata of the latter.

  • Metamedium

    The concept of metamedium, conceived in 1977 by American computer scientist Alan Kay (then at Xerox PARC) posits the idea that the resources of a computer can potentially be used to simulate and mix any media (image, text, sound, video, etc.) and create new ones.

  • MIT Media Lab

    The MIT Media Lab was created in 1985 by Nicholas Negroponte and Jerome Wiesner at the Massachusetts Institute of Technology. It is dedicated to research projects regarding design, interaction and technology. Many famous projects have been launched at the Media Lab, like Wired magazine, the Processing software or the One Laptop Per Child (OLPC) initiative. Major figures of computer science and design like Neil Gershenfeld, Joi Ito, John Maeda, Marvin Minsky and Ethan Zuckerman have worked at the Media Lab.

  • Multimedia

    Coined by Bob Goldstein (USA) in 1966 and taken up in France from 1978 by François Billetdoux, the word multimedia is used to characterize works that combine several media: image, audio, film, video and other interactive content. Today, its meaning has expanded to include the productions and objects related to digital technologies.

  • Nerd

    Coined in the 1950s, the word “nerd” was initially a derogatory term for a person who is socially inept and fervently devoted to intellectual or scientific subjects. The synonymous term “geek” is more recent, and is associated with people involved in computer science and/or technology.

  • Object-Oriented Programming

    Object-oriented programming is a computer programming paradigm using base items named objects (classes) that embed their own internal structure, data and set of methods which define their own behavior and possible interactions with the rest of the program. Conceived in the 1960s, this type of language was developed in the early 1970s at Xerox PARC (with the notable contribution of the American computer scientist Alan Kay who played a large part in the creation of Smalltalk, one of the first object-oriented languages). Today, most programming languages use this paradigm (C++, Java, Python, Objective-C, PHP, etc.)
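
    A minimal sketch in Python (the class is invented for illustration): the object bundles its own data with the methods that define its behavior.

        class Glyph:
            def __init__(self, name, width):
                self.name = name        # internal data carried by each object
                self.width = width

            def scaled(self, factor):   # a method: behavior attached to the object
                return Glyph(self.name, self.width * factor)

        a = Glyph("a", 500)
        print(a.scaled(2).width)        # 1000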

  • OCR

    Optical Character Recognition refers to a set of technologies used to translate text images (often printed documents that are scanned) into text files that can be manipulated (copied and pasted, searched, etc.).

  • Open data

    The term “open data” refers to data sets whose access and use are free, without technical, legal or financial restrictions. By extension, this term is also used to describe recent governmental policies (embodied by websites like data.gov in the United States and data.gouv.fr in France) aspiring to promote the development of an economy based on the reuse of public data.

  • Open source

    As opposed to the philosophy of free software, which is focused on the social consideration of user freedom, open source is a pragmatic programming methodology based on the effectiveness of collaborative work and source code sharing. The term was popularized by Eric Raymond, co-founder of the Open Source Initiative in 1998. 

  • OpenType

    Developed by Microsoft and Adobe from 1996, OpenType is a font file format intended to enrich TrueType, a previous format created by Apple at the end of the 1980s. It adds numerous functionalities, such as a larger maximum number of glyphs per font, support for non-Latin scripts, and the ability to encode ligatures.

  • Operating System

    An operating system (OS) is a set of computer programs responsible for the proper communication between hardware resources (CPU, GPU, storage device, etc.) and the user. It notably acts as a proxy between a machine’s software and hardware.

  • PC

    The personal computer was invented at Xerox PARC in the 1970s by American computer scientist Alan Kay and his team, in reaction to the mainframe architecture that was prevalent at the time. The IBM PC, released in 1981, was the first personal computer to be sold in the millions. Thanks to its open architecture, it became the ancestor of all “PC-compatible” computers that were sold with Microsoft operating systems (first MS-DOS, then Windows).

  • PHP

    A free programming language released in 1994 for the dynamic generation of web pages from a server. PHP (Hypertext Preprocessor) usually interfaces with a relational database (MySQL) to output text files (HTML). It has been the most prevalent web server language since the late 1990s.

  • Pipe

    Used in the shell of Unix-like systems, the pipe (or pipeline) is a procedure for chaining commands so that the output of one process feeds the input of the next. Not to be confused with Pipes, a Yahoo! service offering a visual programming interface intended to sort and filter data from the Web.
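
    A minimal sketch, assuming a Unix-like system where the ls and wc commands are available: the Python snippet below reproduces the shell pipeline ls | wc -l, in which the output of ls feeds the input of wc.

        import subprocess

        ls = subprocess.Popen(["ls"], stdout=subprocess.PIPE)
        wc = subprocess.Popen(["wc", "-l"], stdin=ls.stdout, stdout=subprocess.PIPE)
        ls.stdout.close()   # let ls receive a signal if wc exits first
        print(wc.communicate()[0].decode().strip())   # number of entries listed by ls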

  • Plain Text

    Plain text refers to a type of file format that only contains alphanumeric characters, without any information regarding layout (i.e. color, bold, typeface). The visual aspect of the text depends completely on the software that reads the file. Plain text is preferred for its ease of use and improved compatibility over various software and operating systems. It is mainly used to write source code for computer programs or texts using the Markdown syntax.

  • Plug-in

    A plug-in, also known as add-on or extension module, is a type of program which can be added to an existing software in order to extend its potentialities. These small programs are often released by teams other than the host software’s publisher and cannot be executed as standalones. The most used proprietary plug-in is probably Flash Player which allows the Web browser on which it is installed to display compatible animations.

  • Processing

    Released in 2001 by Benjamin Fry and Casey Reas, two students of John Maeda at the MIT Media Lab, Processing is an easy-to-learn programming language intended for artists, designers and students engaged in visual and interactive creation. Distributed under the GPL license, Processing can produce stand-alone applications or applets that can be displayed in any Web browser.

  • Program

    A computer program is an algorithm whose operations are translated into a programming language. It includes a series of instructions that intend to achieve one or several specific objectives. 

  • Programming

    Computer programming denotes the set of activities involved in writing computer programs, i.e. writing source code in specific programming languages. Source code is then compiled into machine language (binary code) that can be executed by a computer.

  • Programming Language

    A programming language is a notation system, most often textual, used to write the source code of computer programs. Like natural languages, each programming language has its own alphabet, semantics, vocabulary and syntax rules.

  • Python

    Initiated in 1989 by software engineer Guido van Rossum (brother of Just van Rossum, co-founder of the type designers’ collective LettError), the object-oriented programming language Python has the particularity of being interpreted (as opposed to compiled languages): the program’s instructions are translated into machine language during execution. This ensures simplicity of writing and portability from one operating system to another. In the graphic design field, Python is notably used for type design (OpenType format, RoboFont software, etc.).
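
    A minimal sketch: a Python script is a plain-text file that the interpreter executes directly (for example with python script.py), with no separate compilation step required of the user. The font name below is invented for illustration.

        weights = ["Light", "Regular", "Bold"]   # illustrative data
        for weight in weights:
            print(f"MyFont {weight}")            # each instruction is interpreted at run time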

  • Random

    In computer science, a random event is characterized by the theoretical impossibility of predicting its occurrence. Usually dictated by a law of probability, randomness differs from chance, the unpredictability of which depends on factors external to the observer.
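
    A minimal sketch using Python’s random module: seeding the generator makes the “random” sequence reproducible, which shows that the values follow a deterministic procedure governed by a law of probability rather than true chance.

        import random

        random.seed(42)                                  # fix the generator's starting state
        print([random.randint(1, 6) for _ in range(5)])  # the same five "dice rolls" every run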

  • Regular Expression

    In computer science, a regular expression (regex) is a character sequence that describes, according to a precise syntax, a set of possible glyph sequences. Regular expressions are particularly used for searching and replacing portions of text.
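
    A minimal sketch in Python: a regular expression that describes any four-digit sequence, used here to search and replace portions of text (the sentence is invented for illustration).

        import re

        text = "Processing was released in 2001, D3.js in 2010."
        print(re.findall(r"\b\d{4}\b", text))      # ['2001', '2010']
        print(re.sub(r"\b\d{4}\b", "____", text))  # replaces every year with blanks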

  • Rich Text

    Contrary to plain text, rich text file formats preserve text enrichment (styles), page layout and the inclusion of heterogeneous elements such as images. These files are created and edited with word processing software like Word or LibreOffice.

  • Scribus

    Released in 2001, Scribus is layout software under the GPL free license and one of the only alternatives to InDesign, the software used by the vast majority of the graphic design industry. Le Tigre magazine (2006–2014) was designed with Scribus.

  • Script

    A script is a series of instructions which are intended to implement and direct the execution of computer programs. The term shell script is used to denote a program that interacts with a command-line interpreter.

  • Shell

    A shell is a computer program that provides a command-line interface for interacting with an operating system. Such programs were developed with Unix at the beginning of the 1970s; Bash is nowadays the most widely used shell (it is installed by default on most GNU/Linux distributions and on macOS).

  • Shortcode

    Introduced in 2008 with the release of the 2.5 version of the WordPress CMS, shortcodes are small pieces of code written between brackets that allow the administrator to execute simple instructions like media integration (video, etc.).

  • Software

    Software is a set of computer programs (and other operating information) that interacts with the hardware of a computer. There are two main types of software: applications, which allow a user to perform tasks, and so-called “system” software, which enables the machine to run (e.g. printer drivers, network utilities).

  • Source Code

    The source code of a computer program is a text containing instructions written in one or several programming languages. Most of the time, source code is compiled to binary code to be executed (read) by the machine. Once compiled, the binary code is impossible to edit without the source code.

  • Standard

    A standard refers to an industrial norm. In the computer science field, standards allow broader compatibility between several pieces of software or hardware. The most important non-profit standardization organization is the W3C (World Wide Web Consortium), in charge of regulating the compatibility of Web technologies since 1994.

  • Streaming

    Streaming refers to the serving of data on digital networks in a continuous flow from a distant data provider to a client. This mode of uninterrupted media reception is in contrast to conventional downloading, which requires a complete file in order to open it.

  • SVG

    SVG (Scalable Vector Graphics) is a format based on a markup syntax used to describe vector images. It is widely used on the Web in place of raster images (based on pixels).

  • TED

    Initiated in California in 1984, the TED (Technology, Entertainment and Design) talks are organized by the Sapling Foundation to disseminate “ideas worth spreading.” TED talks have caused several controversies, notably due to the events’ high entry price and the format of the talks (i.e. the idea of turning serious issues into a “show”).

  • Teletype

    A teletype (or teleprinter) is a device invented in the 1910s that is able to emit and receive electrically encoded messages. The first computers had no screens, so they used teleprinters as the main input and output peripherals. These devices looked like a sort of typewriter with electronic controls that print user commands and computer answers on paper readouts.

  • Tracker

    In computer science, a tracker records a file’s location and its potential moves. This technique is notably used by BitTorrent, a file-sharing protocol that has adopted the peer-to-peer application architecture. The BitTorrent tracker helps to synchronize the electronic data transfer between users by locating the various parts of the downloaded file (the “bits” are gathered together to reconstruct a full copy of the file).

  • Turing Machine

    A Turing machine is a mathematical model invented by Alan Turing in 1936 that describes the functioning of calculating (computing) devices in order to give a precise definition of the concept of algorithm or “mechanical procedure.” It consists of an infinite tape divided into consecutive cells, a head that can move along the tape and read and write symbols on it, a state register that memorizes the current state of the machine, and an action table that tells the machine which symbol to write, how to move the head and which state to enter next. Any computing problem based on an algorithmic procedure can be solved by a Turing machine.
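
    As an illustration (not taken from the source), the Python sketch below simulates a tiny Turing machine whose action table maps (state, symbol read) to (symbol to write, head move, next state); this particular machine walks right along the tape and inverts a binary word.

        table = {
            ("scan", "0"): ("1", +1, "scan"),
            ("scan", "1"): ("0", +1, "scan"),
            ("scan", " "): (" ", 0, "halt"),   # a blank cell stops the machine
        }

        tape = list("1011 ")   # the "infinite" tape, truncated for display
        head, state = 0, "scan"
        while state != "halt":
            write, move, state = table[(state, tape[head])]
            tape[head] = write
            head += move
        print("".join(tape).strip())   # 0100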

  • Unix

    Created in 1969 by Kenneth Thompson at Bell Labs, Unix is an operating system initially intended to run on mainframe computers whose main interface uses a shell (command-line interpreter). Unix gave birth to numerous other operating systems like GNU/Linux, macOS and iOS, whose uses have shifted to personal computing.

  • URL

    A URL (Uniform Resource Locator) is the Web address that allows a browser to locate an online resource (file, image, webpage, etc.). Formalized by Tim Berners-Lee at the beginning of the 1990s, this addressing scheme is one of the key inventions of the World Wide Web.
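
    A minimal sketch using Python’s standard urllib.parse module to split a fictitious URL into the parts a browser uses to locate a resource:

        from urllib.parse import urlparse

        parts = urlparse("https://www.example.org/articles/index.html?lang=en#top")
        print(parts.scheme)   # https  (the protocol)
        print(parts.netloc)   # www.example.org  (the server)
        print(parts.path)     # /articles/index.html  (the resource on that server)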

  • UX

    UX (User experience) has been a popular term since the early 2000s to name all the elements that make up the “experience” of users of an application or website, including the marketing, purchase price, user interface, functionalities and domain name of the website.

  • Vector

    As opposed to raster (or bitmap, meaning comprised of pixels) images, vector file formats (images or fonts) only describe geometrical shapes (lines, Bézier curves, etc.) and their layout attributes (color, stroke, rotation, etc.). The SVG file format enables the integration of vector images onto webpages.

  • Vox-ATypI

    In an attempt to overcome the limits of the Thibaudeau Classification (1921), Maximilien Vox, publisher, critic, theoretician and typography historian developed a new classification method for typefaces in 1952. Adopted and completed by the Association Typographique Internationale (ATypI, the International Typography Association) in 1962, the Vox-ATypI classification includes eleven families of typefaces based on historical and visual criteria (humanists, garaldes, transitionals, didones, mechanistics, lineals, glyphics, scripts, graphics, blackletters and non-latins).

  • Web

    The World Wide Web (WWW) is a global publication and consultation environment based on a hypertext link system that connects pages to each other. It was developed in 1989 by Tim Berners-Lee and Robert Cailliau at CERN (European Organization for Nuclear Research) near Geneva. It is essential to distinguish the Internet, which consists of the hardware infrastructure and the ensemble of communication protocols between servers (TCP/IP, DNS), from the Web, which uses the Internet for the circulation of the three technical objects that comprise it: HTTP (HyperText Transfer Protocol, used for data transfer), URL (Uniform Resource Locator, for web address management) and HTML (HyperText Markup Language, which describes the content of webpages). Accessing the Web requires dedicated software called a Web browser that reads and displays the HTML code sent by web servers. The term Web 2.0 emerged in the middle of the 2000s and refers to the mutation of user practices following the development of platforms based on behavioral data (Google AdSense) or “social networks” that demand contributions and personal interactions (Flickr, MySpace, Facebook, Twitter, etc.).

  • Web Browser

    A Web browser is a type of software designed to access and render the pages of the World Wide Web. The term evokes the activity of browsing (navigating) the Web, popularized by Netscape Navigator, one of the most important browsers of the mid-1990s. Today, the main browsers are Google Chrome, Mozilla Firefox, Microsoft Edge and Apple Safari.

  • Web Server

    A Web server is a distant computer that is connected to the Internet network and is able to store and allow access to information and data over the Web (websites). Uploading files from a personal computer to a distant server is done via FTP (File Transfer Protocol).

  • Web Tracking

    Web tracking refers to all techniques (cookies, pixel tags, browsing histories, etc.) used to track and collect the browsing history and data of web users, often for direct or indirect commercial purposes. These techniques for collecting personal data are included in “free” services such as search engines, social media, or operating systems (OS). They generate most of the income of companies such as Facebook or Google (Alphabet).

  • Webapp

    A webapp (Web application) is a piece of software that runs directly in a Web browser and whose client interface is entirely built with Web-specific programming languages (HTML, CSS, PHP, JavaScript, etc.). Contrary to “native” applications distributed in app stores (Apple App Store, Google Play, etc.), webapps are universal thanks to their compatibility with any modern browser.

  • Webfont

    A webfont is a typeface file (TrueType, OpenType, Embedded OpenType, WOFF, SVG, etc.) meant to be sent by a server and then displayed in a client’s browser (through the CSS instruction @font-face). Announced in September 2016, the new Variable Font format will eventually make it possible to send a single file capable of generating an infinite number of variations (weights, proportions, etc.).

  • WebGL

    Initiated at the end of the 2000s, the WebGL standard is used to display 3D graphical elements in a Web browser. This technology is based on the combination of the JavaScript programming language with the JSON data format and the OpenGL standard API.

  • Wiki

    A wiki is a type of website that enables its visitors to create and edit content in a collaborative way, with or without registering. The first wiki, WikiWikiWeb, was released in 1995. The most famous example is the free, universal and multilingual encyclopedia Wikipedia, launched in 2001 by Jimmy Wales and Larry Sanger.

  • WIMP

    WIMP (Windows, Icons, Menus and Pointing device) is an interface paradigm invented at Xerox PARC at the beginning of the 1970s by the American computer scientist Alan Kay and his team while developing the Xerox Alto, the first personal computer. In this system, the user interacts with the machine through images that are abstractions of data or functions, activated with the pointing device. It is, nowadays, the main interface model used in the operating systems of personal computers (but not in those of tactile devices like smartphones or tablets).

  • WYSIWYG

    The term WYSIWYG (What You See Is What You Get) describes the graphical user interface of software whose on-screen display corresponds to the result obtained when the document is produced (usually in print). Bravo, the text editor of the Xerox Alto released in 1973, was the first WYSIWYG software. It is now the dominant interface paradigm of so-called creative software such as InDesign, Illustrator and Photoshop.

  • Xerox PARC

    Xerox PARC (Palo Alto Research Center) is a research and development center located in Palo Alto, California. Created in 1970 by the photocopier company Xerox, PARC is, to a large extent, responsible for most of the contemporary personal computing paradigms: object-oriented programming, the Ethernet network, graphical user interfaces, the desktop metaphor, WYSIWYG, etc. Released in 1973, the Xerox Alto was one of the first personal computers with a graphical interface. After Steve Jobs visited Xerox PARC in 1979, a number of these innovations were taken up by Apple and implemented in the Lisa in 1983 and later in the Macintosh in 1984.

  • XML

    Acronym for Extensible Markup Language, XML is a generic markup language dedicated to the description and exchange of data between automated systems. Its syntax is called “extensible” because it allows its user to define various namespaces, i.e. languages with their own vocabulary and grammar, such as XHTML (web pages), RSS (news feeds), SVG (vector images), etc.
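
    A minimal sketch using Python’s standard xml.etree.ElementTree module to parse a small, invented XML fragment and read its structure back:

        import xml.etree.ElementTree as ET

        doc = ET.fromstring("<book lang='fr'><title>Back Office</title></book>")
        print(doc.tag, doc.get("lang"))   # book fr
        print(doc.find("title").text)     # Back Office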