A workflow consists of an orchestrated and repeatable pattern of business activity enabled by the systematic organization of resources into processes that transform materials, provide services, or process information. It can be depicted as a sequence of operations, declared as work of a person or group, an organization of staff, or one or more simple or complex mechanisms.
From a more abstract or higher-level perspective, workflow may be a view or representation of real work, thus serving as a virtual representation of actual work. The flow being described may refer to a document, service or product that is being transferred from one step to another. 
While the assembly line remains the most famous implementation of a workflow from this era, early thinking about work was more sophisticated than is commonly understood. The notion of flow was more than a sequential breakdown of processing: the conceptual models of modern operations research, including job shops and queuing systems (Markov chains), can be found in embryonic form in early 20th-century industry.
Information-based workflows began to grow during this era, although the concept of an information flow lacked flexibility. A particularly influential figure was Melvil Dewey (inventor of the eponymous Dewey Decimal System). This era is thus identified with the simplest notions of workflow optimization: throughput and resource utilization.
The cultural impact of workflow optimization during this era can be understood through films such as Chaplin's classic Modern Times. These concepts did not stay confined to the shop floor. One magazine invited housewives to puzzle over the fastest way to toast three slices of bread on a two-slice grill that toasted only one side at a time. The book Cheaper by the Dozen introduced the emerging concepts to the context of family life.
Maturation and growth
The invention of the typewriter and the copier helped spread the study of the rational organization of labor from the manufacturing shop floor to the office. Filing systems and other sophisticated systems for managing physical information flows evolved. Two events provided a huge impetus to the development of formalized information workflows. First, the field of optimization theory matured and developed mathematical optimization techniques. Second, World War II and the Apollo program were unprecedented in their demands for the rational organization of work.
The classic management book The Organization Man by William H. Whyte, published in 1956, culturally captured the nature of work in this era.
In 1995, publishers began studying how traditional publishing processes could be re-engineered and streamlined into digital processes in order to reduce lag time, as well as the substantial printing and shipping costs of delivering print copies of books and journals to warehouses and subscribers. The term electronic workflow was used to describe the publishing process, from online delivery of digital manuscripts to the posting of content on the web for online access.
During the 1980s, two aspects of workflow organization drew heavy criticism. First, the methods pioneered by Taylor modeled humans as simple automata. The classical industrial-style organization of work was critiqued as being both dehumanizing and suboptimal in its use of the potential of human beings. Maslow's hierarchy of needs, which describes human needs for self-actualization and creative engagement in work, became a popular tool in this critique. This issue was acknowledged, but did not gain much traction otherwise.
The second critique had to do with quality. Workflows optimized for a particular time became inflexible as work conditions changed. Quality, in both its analytic and synthetic manifestations, transformed the nature of work through a variety of movements ranging from total quality management to Six Sigma, then to more qualitative notions of business process re-engineering (Hammer and Champy, 1993). Under the influence of the quality movement, workflows became the subject of much scrutiny and optimization effort. Acknowledgement of the dynamic and changing nature of the demands on workflows came in the form of recognition of the phenomena associated with critical paths and moving bottlenecks.
The experiences of the quality movement made it clear that information flows are fundamentally different from flows of mass and energy, and this insight inspired new approaches to the rational organization of workflows. The low cost and adaptability of information flows were seen as enabling workflows that were at once highly rational in their organization and highly flexible, adaptable and responsive. These insights brought a whole range of information technology to bear on workflows in manufacturing, services and pure information work. Flexible manufacturing systems, just-in-time inventory management, and other highly agile and adaptable systems of workflow are products of this era.
The concept of workflow is closely related to several fields in operations research and other areas that study the nature of work, either quantitatively or qualitatively, such as artificial intelligence (in particular, the sub-discipline of AI planning) and ethnography. The term workflow is more commonly used in particular industries, such as printing and professional domains, where it may have particular specialized meanings.
Processes: A process is a more general notion than workflow and can apply to physical or biological processes, for instance; whereas a workflow is typically a process or collection of processes described in the context of work, such as all processes occurring in a machine shop.
Planning and scheduling: A plan is a description of the logically necessary, partially ordered set of activities required to accomplish a specific goal given certain starting conditions. A plan, when augmented with a schedule and resource allocation calculations, completely defines a particular instance of systematic processing in pursuit of a goal. A workflow may be viewed as an (often optimal or near-optimal) realization of the mechanisms required to execute the same plan repeatedly.
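The partial ordering that distinguishes a plan can be made concrete with a small sketch. The example below (task names are purely illustrative, not from the text) represents a plan as a dependency graph and derives one valid execution order, i.e. a linearization that respects the partial order:

```python
from graphlib import TopologicalSorter

# A hypothetical plan: each activity maps to the set of activities
# that must logically precede it (the partial order).
plan = {
    "cut_parts": set(),
    "drill_holes": {"cut_parts"},
    "paint": {"cut_parts"},
    "assemble": {"drill_holes", "paint"},
}

# static_order() yields one valid linearization of the partial order;
# several may exist (here, drill_holes and paint can swap places).
order = list(TopologicalSorter(plan).static_order())
print(order)
```

A workflow, in the article's sense, would fix one such linearization together with a schedule and resource assignments so the plan can be executed repeatedly.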
Flow control is a control concept applied to workflows. In contrast to the static control of buffers of material or orders, it refers to the more dynamic control of flow speed and flow volumes in motion and in process. Such an orientation toward dynamic aspects is the basic foundation for more advanced job-shop controls, such as just-in-time or just-in-sequence.
In-transit visibility is a monitoring concept that applies to transported material as well as to work in process or work in progress, i.e., workflows.
The following examples illustrate the variety of workflows seen in various contexts:
In machine shops, particularly job shops and flow shops, the flow of a part through the various processing stations is a workflow.
Insurance claims processing is an example of an information-intensive, document-driven workflow.
Wikipedia editing can be modeled as a stochastic workflow.
The Getting Things Done system is a model of personal workflow management for information workers.
In software development, support and other industries, the concept of follow-the-sun describes a process of passing unfinished work across time zones.
In traditional offset and digital printing, the concept of workflow represents the process, people, and usually software technology (RIPs, raster image processors, or DFE, digital front end, controllers) that play a part in the pre- and post-processing of print-related files, e.g., PDF preflight checking to make certain that fonts are embedded and that the imaged output to plate or digital press will render the document's intent properly for the image-output capabilities of the press that will print the final image.
In scientific experiments, the overall process (tasks and data flow) can be described as a directed acyclic graph (DAG). This DAG is referred to as a workflow, e.g., brain imaging workflows.
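The DAG view can be sketched with a minimal runner: each task is a function whose inputs are the outputs of its predecessor tasks, and a task runs as soon as all its predecessors have produced results. Task names and functions here are hypothetical stand-ins for real analysis steps:

```python
# Each entry: task name -> (function, list of predecessor tasks).
tasks = {
    "load":      (lambda: [3, 1, 2],            []),
    "sort":      (lambda xs: sorted(xs),        ["load"]),
    "summarise": (lambda xs: sum(xs) / len(xs), ["sort"]),
}

def run(tasks):
    """Execute a DAG workflow: fire each task once all its inputs exist."""
    results = {}
    remaining = dict(tasks)
    while remaining:
        for name, (fn, deps) in list(remaining.items()):
            if all(d in results for d in deps):
                results[name] = fn(*(results[d] for d in deps))
                del remaining[name]
    return results

print(run(tasks)["summarise"])  # 2.0
```

Real scientific workflow systems add scheduling, data staging, and provenance tracking on top of this basic dependency-driven execution model.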
In healthcare data analysis, a workflow can be used to represent a sequence of steps which compose a complex data analysis (data-search and data-manipulation steps).
In service-oriented architectures, an application can be represented through an executable workflow, in which different, possibly geographically distributed, service components interact to provide the corresponding functionality under the control of a workflow management system.
Features and phenomenology
Modeling: Workflow problems can be modeled and analyzed using graph-based formalisms like Petri nets.
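To make the Petri-net idea concrete, here is a minimal sketch (place, transition, and token counts are illustrative, not a standard library API): a transition is enabled when every input place holds enough tokens, and firing it moves tokens from input places to output places.

```python
# Marking: tokens currently held at each place (hypothetical claim workflow).
marking = {"submitted": 1, "reviewed": 0, "approved": 0}

# Transition: (tokens consumed from input places, tokens added to output places).
transitions = {
    "review":  ({"submitted": 1}, {"reviewed": 1}),
    "approve": ({"reviewed": 1},  {"approved": 1}),
}

def enabled(t):
    pre, _ = transitions[t]
    return all(marking[p] >= n for p, n in pre.items())

def fire(t):
    pre, post = transitions[t]
    assert enabled(t), f"{t} is not enabled"
    for p, n in pre.items():
        marking[p] -= n
    for p, n in post.items():
        marking[p] += n

fire("review")
fire("approve")
print(marking)  # {'submitted': 0, 'reviewed': 0, 'approved': 1}
```

This token-game semantics is what makes Petri nets amenable to formal analysis, e.g. checking whether an "approved" marking is reachable from the initial one.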
Measurement: Many of the concepts used to measure scheduling systems in operations research are useful for measuring general workflows. These include throughput, processing time, and resource utilization.
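Two of these metrics can be computed directly from job timestamps. The figures below are invented for illustration:

```python
# (start, finish) times in minutes for four hypothetical jobs.
jobs = [(0, 4), (1, 6), (3, 7), (5, 12)]

# Processing time: how long each job spends in the workflow.
processing_times = [finish - start for start, finish in jobs]

# Throughput: jobs completed per unit time over the observation window.
window = max(f for _, f in jobs) - min(s for s, _ in jobs)
throughput = len(jobs) / window

print(sum(processing_times) / len(jobs))  # mean processing time: 5.0
print(throughput)  # ≈ 0.33 jobs per minute
```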
Scientific workflow systems: These found wide acceptance in the fields of bioinformatics and cheminformatics in the early 2000s, where they successfully met the need for multiple interconnected tools, handling of multiple data formats, and large data quantities. The paradigm of scientific workflows was also close to the well-established tradition of Perl programming in life-science research organizations, so this adoption represented a natural step toward a more structured infrastructure setup.
Human-machine interaction: Several conceptualizations of mixed-initiative workflows have been studied, particularly in the military, where automated agents play roles just as humans do. For innovative, adaptive, collaborative human work the techniques of human interaction management are required.
Workflow analysis: Workflow systems allow users to develop executable processes without familiarity with formal programming concepts. Automated workflow analysis techniques can help users verify certain properties of their workflows before executing them, e.g., by analyzing flow control or data flow. Tools based on formal analysis frameworks have been developed for the analysis of scientific workflows and can be extended to the analysis of other types of workflows.
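A very small static check in this spirit, sketched below with hypothetical step and data names, verifies one data-flow property before execution: every input a step consumes must be produced by some earlier step.

```python
# (name, inputs consumed, outputs produced), listed in execution order.
steps = [
    ("fetch",  [],        ["raw"]),
    ("clean",  ["raw"],   ["table"]),
    ("report", ["table"], ["pdf"]),
]

def check_dataflow(steps):
    """Raise ValueError if any step consumes data no earlier step produced."""
    produced = set()
    for name, inputs, outputs in steps:
        missing = [i for i in inputs if i not in produced]
        if missing:
            raise ValueError(f"step {name!r} consumes undefined data: {missing}")
        produced.update(outputs)

check_dataflow(steps)  # passes silently for a well-formed workflow
```

Production analysis tools check far richer properties (deadlock freedom, reachability, type compatibility), but they follow the same pattern of validating the workflow description before any step runs.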
Workflow improvement theories
The key benefit of understanding workflow in a business context comes from modelling the throughput of the workstream path: the efficiency of the flow route through internal silos can then be evaluated, with a view to increasing discrete control of uniquely identified business attributes and rules and reducing sources of inefficiency. Evaluation of resources, both physical and human, is essential for assessing hand-off points and the potential to create smoother transitions between tasks. Several workflow improvement theories have been proposed and implemented in the modern workplace, including total quality management, Six Sigma, and business process re-engineering.
As a way of bridging the gap between different workflow systems, significant effort is being put into defining workflow patterns that can be used to compare workflow engines across domains.
A workflow can usually be described using formal or informal flow diagramming techniques, showing directed flows between processing steps. A single processing step or component of a workflow can be defined by three parameters:
input description: the information, material and energy required to complete the step
transformation rules: algorithms, which may be carried out by associated human roles or machines, or a combination of both
output description: the information, material and energy produced by the step and provided as input to downstream steps.
Components can be plugged together only if the output of one upstream component (or set of components) matches the mandatory input requirements of the following component. Thus, the essential description of a component comprises only its input and output, described fully in terms of data types and their meaning (semantics). The algorithms or rules need be described only when there are several alternative ways to transform one type of input into one type of output, possibly with different accuracy, speed, etc.
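This plug-compatibility rule can be sketched in a few lines: each component declares its input and output types, and two components may be chained only when the first's output type matches the second's input type. The component names and transformations below are illustrative only:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Component:
    name: str
    in_type: type       # type of data the component consumes
    out_type: type      # type of data the component produces
    transform: Callable  # the transformation rule itself

def chain(a: Component, b: Component) -> Component:
    """Plug a into b, rejecting the connection if the types do not match."""
    if a.out_type is not b.in_type:
        raise TypeError(f"cannot plug {a.name!r} into {b.name!r}")
    return Component(f"{a.name}|{b.name}", a.in_type, b.out_type,
                     lambda x: b.transform(a.transform(x)))

parse = Component("parse", str, list, lambda s: s.split(","))
count = Component("count", list, int, len)

pipeline = chain(parse, count)
print(pipeline.transform("a,b,c"))  # 3
```

Attempting `chain(count, parse)` raises a TypeError, mirroring the rule that a component's output must satisfy the next component's input requirements.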
When the components are non-local services that are invoked remotely via a computer network, such as Web services, additional descriptors (such as QoS and availability) must also be considered.
Many software systems exist to support workflows in particular domains. Such systems manage tasks such as automatic routing, partially automated processing and integration between different functional software applications and hardware systems that contribute to the value-addition process underlying the workflow.