Chattanooga Software Center
 

Glossary of Computer Software Development Terminology

 

The terms are defined, as much as possible, using available standards. The source of such definitions appears immediately following the term or phrase in parentheses, e.g., (NIST).

 

The source documents are listed at the bottom of this page.

 

 
 

ADC. analog-to-digital converter.

A piece of hardware that translates analog (continuously variable) signals into digital (discrete on/off) signals. A common use is in computer sound systems, where an ADC digitizes the analog signal from a source such as a microphone so that the computer can store and process it. See: analog, digital.
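
As a rough illustration of the idea (not tied to any particular converter), the sketch below quantizes an analog voltage into an 8-bit code the way a simple ADC does in hardware. The function name, reference voltage, and bit width are hypothetical.

    def adc_sample(voltage, v_ref=5.0, bits=8):
        """Quantize an analog voltage (0..v_ref) into an n-bit digital code."""
        levels = 2 ** bits                       # 256 levels for an 8-bit converter
        voltage = max(0.0, min(voltage, v_ref))  # clamp to the converter's input range
        return int(voltage / v_ref * (levels - 1) + 0.5)  # round to the nearest level

    # A 3.3 V input on a 5 V, 8-bit converter maps to code 168.
    print(adc_sample(3.3))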

 

ALU. arithmetic logic unit.

See: arithmetic logic unit.

 

ANSI. American National Standards Institute.

 

ASCII. American Standard Code for Information Interchange.

A character encoding that assigns characters to the byte values 0-127. The characters covered include letters, digits, and the most widely used symbols, as well as a number of reserved control characters. Since a byte is usually 8 bits and can hold the values 0-255, many character sets (for example Latin-1 and UTF-8) are ASCII-compatible: they keep the 0-127 assignments and add further special characters beyond them.
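
In Python, for instance, the built-in ord and chr functions expose these numeric codes directly:

    # ASCII assigns 'A' the value 65 and 'a' the value 97.
    print(ord('A'), ord('a'))        # 65 97
    print(chr(65), chr(97))          # A a

    # ASCII-compatible supersets such as Latin-1 reuse codes 0-127 unchanged
    # and add characters in the 128-255 range.
    print('A'.encode('latin-1'))     # b'A'
    print('\xe9'.encode('latin-1'))  # b'\xe9': code 233, outside the 7-bit ASCII range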

 

abstraction. The separation of the logical properties of data or function from its implementation in a computer program. See: encapsulation, information hiding, software engineering.

This term refers to software that is designed to be called by other software. The abstracted software performs certain functions (such as connecting to another system or reading certain data) without exposing to its callers the specifics of how the operation is carried out under the hood.
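
A minimal sketch of the idea in Python, using hypothetical names: the calling code depends only on the abstract save operation and never sees how storage is actually implemented.

    from abc import ABC, abstractmethod

    class Storage(ABC):
        """Abstract interface: callers depend on the logical operation only."""
        @abstractmethod
        def save(self, key, data): ...

    class FileStorage(Storage):
        """One concrete implementation; its details stay hidden behind Storage."""
        def save(self, key, data):
            with open(key, 'wb') as f:
                f.write(data)

    def archive(report, store):
        # archive() knows only the abstraction, not how saving is carried out.
        store.save('report.bin', report)

    archive(b'quarterly results', FileStorage())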

 

access. (ANSI) To obtain the use of a resource.

 

access time. (ISO) The time interval between the instant at which a call for data is initiated and the instant at which the delivery of the data is completed.

 

accident. See: mishap.

 

accuracy. (IEEE) (1) A qualitative assessment of correctness or freedom from error. (2) A quantitative measure of the magnitude of error. Contrast with precision. (CDRH) (3) The measure of an instrument's capability to approach a true or absolute value. It is a function of precision and bias. See: bias, precision, calibration.

 

accuracy study processor. A software tool used to perform calculations or determine accuracy of computer manipulated program variables.

 

actuator. A peripheral [output] device which translates electrical signals into mechanical actions; e.g., a stepper motor which acts on an electrical signal received from a computer instructing it to turn its shaft a certain number of degrees or a certain number of rotations. See: servomechanism.

 

adaptive maintenance. (IEEE) Software maintenance performed to make a computer program usable in a changed environment. Contrast with corrective maintenance, perfective maintenance.

 

address. (1) A number, character, or group of characters which identifies a given device or a storage location which may contain a piece of data or a program step. (2) To refer to a device or storage location by an identifying number, character, or group of characters.

 

addressing exception. (IEEE) An exception that occurs when a program calculates an address outside the bounds of the storage available to it.

 

algorithm. (IEEE) (1) A finite set of well-defined rules for the solution of a problem in a finite number of steps. (2) Any sequence of operations for performing a specific task.
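
Euclid's greatest-common-divisor procedure is a classic example of sense (1): a finite set of well-defined rules that reaches an answer in a finite number of steps.

    def gcd(a, b):
        """Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b)."""
        while b:
            a, b = b, a % b
        return a

    print(gcd(48, 36))  # -> 12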

 

algorithm analysis. (IEEE) A software V&V task to ensure that the algorithms selected are correct, appropriate, and stable, and meet all accuracy, timing, and sizing requirements.

 

alphanumeric. Pertaining to a character set that contains letters, digits, and usually other characters such as punctuation marks.

 

American National Standards Institute. 11 West 42nd Street, New York, N.Y. 10036. An organization that coordinates the development of U.S. voluntary national standards for nearly all industries. It is the U.S. member body to ISO and IEC. Information technology standards pertain to programming languages, electronic data interchange, telecommunications and physical properties of diskettes, cartridges and magnetic tapes.

 

American Standard Code for Information Interchange. A seven bit code adopted as a standard to represent specific data characters in computer systems, and to facilitate interchange of data between various machines and systems. Provides 128 possible characters, the first 32 of which are non-printing characters used for transmission and device control. Since common storage is an 8-bit byte [256 possible characters] and ASCII uses only 128, the extra bit is used to hold a parity bit or create special symbols. See: extended ASCII.

 

analog. Pertaining to data [signals] in the form of continuously variable [wave form] physical quantities; e.g., pressure, resistance, rotation, temperature, voltage. Contrast with digital.

 

analog device. (IEEE) A device that operates with variables represented by continuously measured quantities such as pressures, resistances, rotations, temperatures, and voltages.

 

analog-to-digital converter. Input related devices which translate an input device's [sensor] analog signals to the corresponding digital signals needed by the computer. Contrast with DAC [digital-to-analog converter]. See: analog, digital.

 

analysis. (1) To separate into elemental parts or basic principles so as to determine the nature of the whole. (2) A course of reasoning showing that a certain result is a consequence of assumed premises. (3) (ANSI) The methodical investigation of a problem, and the separation of the problem into smaller related units for further detailed study.

 

anomaly. (IEEE) Anything observed in the documentation or operation of software that deviates from expectations based on previously verified software products or reference documents. See: bug, defect, error, exception, fault.

 

application program. See: application software.

 

application software. (IEEE) Software designed to fill specific needs of a user; for example, software for navigation, payroll, or process control. Contrast with support software; system software.

 

architectural design. (IEEE) (1) The process of defining a collection of hardware and software components and their interfaces to establish the framework for the development of a computer system. See: functional design. (2) The result of the process in (1). See: software engineering.

 

architecture. (IEEE) The organizational structure of a system or component. See: component, module, subprogram, routine.

 

archival database. (ISO) An historical copy of a database saved at a significant point in time for use in recovery or restoration of the database.

 

archive. (IEEE) A lasting collection of computer system data or other records that are in long term storage.

 

archive file. (ISO) A file that is part of a collection of files set aside for later research or verification, for security purposes, for historical or legal purposes, or for backup.

 

arithmetic logic unit. The [high speed] circuits within the CPU which are responsible for performing the arithmetic and logical operations of a computer.

A building block of a CPU. An ALU is an electrical circuit arranged to perform arithmetic and logical operations such as addition, subtraction, and comparison. Most CPUs contain several such circuits so that the most common and simplest operations execute quickly, leaving more complex operations to be built up from these primitives by software or the operating system.
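
A loose software analogy (a real ALU is a hardware circuit, not code): an opcode selects one of a small set of primitive operations applied to two operands.

    # Toy model of an ALU: the opcode picks the operation.
    ALU_OPS = {
        'ADD': lambda a, b: a + b,
        'SUB': lambda a, b: a - b,
        'AND': lambda a, b: a & b,
        'OR':  lambda a, b: a | b,
    }

    def alu(opcode, a, b):
        return ALU_OPS[opcode](a, b)

    print(alu('ADD', 6, 7))            # -> 13
    print(alu('AND', 0b1100, 0b1010))  # -> 8 (0b1000)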

 

arithmetic overflow. (ISO) That portion of a numeric word that expresses the result of an arithmetic operation, by which the length of the word exceeds the word length of the space provided for the representation of the number. See: overflow, overflow exception.
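
Python integers never overflow, so the sketch below simulates an unsigned 8-bit register to show the effect:

    def add_8bit(a, b):
        """Add two unsigned 8-bit values; report whether the result overflowed."""
        total = a + b
        overflowed = total > 0xFF        # the result no longer fits in 8 bits
        return total & 0xFF, overflowed  # keep only the low 8 bits, as hardware would

    print(add_8bit(200, 100))  # -> (44, True): 300 wraps around to 44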

 

arithmetic underflow. (ISO) In an arithmetic operation, a result whose absolute value is too small to be represented within the range of the numeration system in use. See: underflow, underflow exception.

 

array. (IEEE) An n-dimensional ordered set of data items identified by a single name and one or more indices, so that each element of the set is individually addressable; e.g., a matrix, table, or vector.
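
For example, a two-dimensional array (a matrix) makes every element addressable by a row index and a column index:

    matrix = [
        [1, 2, 3],
        [4, 5, 6],
    ]
    print(matrix[1][2])    # row 1, column 2 -> 6

    vector = [10, 20, 30]  # a one-dimensional array needs a single index
    print(vector[0])       # -> 10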

 

as built. (NIST) Pertaining to an actual configuration of software code resulting from a software development project.

 

assemble. See: assembling.

 

assembler. (IEEE) A computer program that translates programs [source code files] written in assembly language into their machine language equivalents [object code files]. Contrast with compiler, interpreter. See: cross-assembler, cross-compiler.

 

assembling. (NIST) Translating a program expressed in an assembly language into object code.

 

assembly code. See: assembly language.

 

assembly language. (IEEE) A low level programming language, that corresponds closely to the instruction set of a given computer, allows symbolic naming of operations and addresses, and usually results in a one-to-one translation of program instructions [mnemonics] into machine instructions. See: low-level language.

 

assertion. (NIST) A logical expression specifying a program state that must exist or a set of conditions that program variables must satisfy at a particular point during program execution.
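
Many languages provide a statement for this directly; in Python, for example:

    def average(values):
        # Assertion: the program state must satisfy this condition here.
        assert len(values) > 0, "average() requires at least one value"
        return sum(values) / len(values)

    print(average([2, 4, 6]))  # -> 4.0
    # average([]) would raise AssertionError instead of dividing by zero.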

 

assertion checking. (NIST) Checking of user-embedded statements that assert relationships between elements of a program. An assertion is a logical expression that specifies a condition or relation among program variables. Tools that test the validity of assertions as the program is executing or tools that perform formal verification of assertions have this feature. See: instrumentation; testing, assertion.

 

asynchronous. Occurring without a regular time relationship, i.e., timing independent.

 

asynchronous transmission. A timing independent method of electrical transfer of data in which the sending and receiving units are synchronized on each character, or small block of characters, usually by the use of start and stop signals. Contrast with synchronous transmission.
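
A rough sketch of the framing idea (one start bit, 8 data bits sent least significant first, one stop bit; actual line parameters vary):

    def frame_byte(value):
        """Frame one character for asynchronous transmission."""
        data_bits = [(value >> i) & 1 for i in range(8)]
        return [0] + data_bits + [1]  # start bit, data bits, stop bit

    # The letter 'A' (0x41) framed for the line.
    print(frame_byte(ord('A')))  # -> [0, 1, 0, 0, 0, 0, 0, 1, 0, 1]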

 

audit. (1) (IEEE) An independent examination of a work product or set of work products to assess compliance with specifications, standards, contractual agreements, or other criteria. See: functional configuration audit, physical configuration audit. (2) (ANSI) To conduct an independent review and examination of system records and activities in order to test the adequacy and effectiveness of data security and data integrity procedures, to ensure compliance with established policy and operational procedures, and to recommend any necessary changes. See: computer system audit, software audit.

 

audit trail. (1) (ISO) Data in the form of a logical path linking a sequence of events, used to trace the transactions that have affected the contents of a record. (2) A chronological record of system activities that is sufficient to enable the reconstruction, review, and examination of the sequence of environments and activities surrounding or leading to each event in the path of a transaction from its inception to output of final results.

 

auxiliary storage. Storage device other than main memory [RAM]; e.g., disks and tapes.

 
     
 

Source Documents

 

  1. The New IEEE Standard Dictionary of Electrical and Electronics Terms, IEEE Std. 100-1992.
  2. IEEE Standards Collection, Software Engineering, 1994 Edition, published by the Institute of Electrical and Electronics Engineers, Inc.
  3. National Bureau of Standards [NBS] Special Publication 500-75 Validation, Verification, and Testing of Computer Software, 1981.
  4. Federal Information Processing Standards [FIPS] Publication 101, Guideline For Lifecycle Validation, Verification, and Testing of Computer Software, 1983.
  5. Federal Information Processing Standards [FIPS] Publication 105, Guideline for Software Documentation Management, 1984.
  6. American National Standard for Information Systems, Dictionary for Information Systems, American National Standards Institute, 1991.
  7. FDA Technical Report, Software Development Activities, July 1987.
  8. FDA Guide to Inspection of Computerized Systems in Drug Processing, 1983.
  9. FDA Guideline on General Principles of Process Validation, May 1987.
  10. Reviewer Guidance for Computer Controlled Medical Devices Undergoing 510(k) Review, Office of Device Evaluation, CDRH, FDA, August 1991.
  11. HHS Publication FDA 90-4236, Preproduction Quality Assurance Planning.
  12. MIL-STD-882C, Military Standard System Safety Program Requirements, 19JAN1993.
  13. International Electrotechnical Commission, International Standard 1025, Fault Tree Analysis.
  14. International Electrotechnical Commission, International Standard 812, Analysis Techniques for System Reliability - Procedure for Failure Mode and Effects Analysis [FMEA].
  15. FDA recommendations, Application of the Medical Device GMP to Computerized Devices and Manufacturing Processes, May 1992.
  16. Pressman, R., Software Engineering, A Practitioner's Approach, Third Edition, McGraw-Hill, Inc., 1992.
  17. Myers, G., The Art of Software Testing, Wiley Interscience, 1979.
  18. Beizer, B., Software Testing Techniques, Second Edition, Van Nostrand Reinhold, 1990.
Additional general references used in developing some definitions are:

  19. Bohl, M., Information Processing, Fourth Edition, Science Research Associates, Inc., 1984.
  20. Freedman, A., The Computer Glossary, Sixth Edition, American Management Association, 1993.
  21. McGraw-Hill Electronics Dictionary, Fifth Edition, 1994, McGraw-Hill Inc.
  22. McGraw-Hill Dictionary of Scientific & Technical Terms, Fifth Edition, 1994, McGraw-Hill Inc.
  23. Webster's New Universal Unabridged Dictionary, Deluxe Second Edition, 1979.


The bulk of this information was obtained from FDA.gov.

 

BIOS. basic input/output system.

 

bps. bits per second.

 

band. Range of frequencies used for transmitting a signal. A band can be identified by the difference between its lower and upper limits, i.e. bandwidth, as well as by its actual lower and upper limits; e.g., a 10 MHz band in the 100 to 110 MHz range.

 

bandwidth. The transmission capacity of a computer channel, communications line or bus. It is expressed in cycles per second [Hz], and also is often stated in bits or bytes per second. See: band.

 

bar code. (ISO) A code representing characters by sets of parallel bars of varying thickness and separation that are read optically by transverse scanning.

 

baseline. (NIST) A specification or product that has been formally reviewed and agreed upon, that serves as the basis for further development, and that can be changed only through formal change control procedures.

 

BASIC. An acronym for Beginner's All-purpose Symbolic Instruction Code, a high-level programming language intended to facilitate learning to program in an interactive environment.

 

basic input/output system. Firmware that activates peripheral devices in a PC. Includes routines for the keyboard, screen, disk, parallel port and serial port, and for internal services such as time and date. It accepts requests from the device drivers in the operating system as well as from application programs. It also contains autostart functions that test the system on startup and prepare the computer for operation. It loads the operating system and passes control to it.

 

batch. (IEEE) Pertaining to a system or mode of operation in which inputs are collected and processed all at one time, rather than being processed as they arrive, and a job, once started, proceeds to completion without additional input or user interaction. Contrast with conversational, interactive, on-line, real time.

 

batch processing. Execution of programs serially with no interactive processing. Contrast with real time processing.

 

baud. The signalling rate of a line, i.e., the switching speed, or the number of transitions [voltage or frequency changes] made per second. At low speeds baud is equal to bits per second; e.g., 300 baud is equal to 300 bps. However, one baud can be made to represent more than one bit per second.
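
The relationship can be shown with a simple calculation (the line parameters here are hypothetical):

    # bits per second = baud (transitions per second) x bits carried per transition
    baud_rate = 2400     # transitions per second
    bits_per_symbol = 4  # e.g., a modulation scheme with 16 distinct signal states
    print(baud_rate * bits_per_symbol)  # -> 9600 bits per second on a 2400 baud line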

 

benchmark. A standard against which measurements or comparisons can be made.

 

bias. A measure of how closely the mean value in a series of replicate measurements approaches the true value. See: accuracy, precision, calibration.
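
For example, with hypothetical replicate readings of a known 10.00 standard:

    true_value = 10.00
    replicates = [10.02, 10.03, 10.01, 10.04, 10.02]  # hypothetical measurements

    mean = sum(replicates) / len(replicates)
    bias = mean - true_value
    print(round(bias, 3))  # -> 0.024: the instrument reads high on average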

 

binary. The base two number system. Permissible digits are "0" and "1".
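
For example, in Python:

    print(bin(13))          # -> '0b1101': 13 written in base two
    print(int('1101', 2))   # -> 13: a binary string parsed back to an integer
    print(0b1101 + 0b0011)  # -> 16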

 

bit. A contraction of the term binary digit. The bit is the basic unit of digital data. It may be in one of two states, logic 1 or logic 0. It may be thought of as a switch which is either on or off. Bits are usually combined into computer words of various sizes, such as the byte.

 

bits per second. A measure of the speed of data transfer in a communications system.

 

black-box testing. See: testing, functional.

 

block. (ISO) (1) A string of records, words, or characters that for technical or logical purposes are treated as a unity. (2) A collection of contiguous records that are recorded as a unit, and the units are separated by interblock gaps. (3) A group of bits or digits that are transmitted as a unit and that may be encoded for error-control purposes. (4) In programming languages, a subdivision of a program that serves to group related statements, delimit routines, specify storage allocation, delineate the applicability of labels, or segment parts of the program for other purposes. In FORTRAN, a block may be a sequence of statements; in COBOL, it may be a physical record.

 
