Explore commonly used Workload Automation and Job Scheduling terms.
Access controls define who can view or use systems, files or workflows based on set permissions. They restrict unauthorized access and help enforce security policies and compliance.
An active-active cluster uses multiple nodes that run at the same time to process workloads. It improves uptime by balancing traffic and removing single points of failure.
Ad-hoc file transfer is a one-time, user-initiated file exchange without predefined workflows. It allows quick sharing but still requires security controls like encryption and access limits.
AES is a widely used symmetric encryption standard that protects sensitive data in transit and at rest. It replaced older methods like 3DES due to stronger security and better performance.
Air-gapped networks are systems physically isolated from external connections. This separation prevents remote access and helps protect highly sensitive data.
ANSI X9 is a set of financial industry standards that define secure electronic transactions. It covers areas like encryption, key management and financial messaging.
Antivirus scanning checks files for malware before or after transfer using known threat patterns. It helps protect systems and ensures files are safe to use or share.
AS1 is an EDI protocol that sends encrypted and signed messages over email using SMTP and S/MIME. It supports secure business data exchange with non-repudiation.
AS2 is an HTTP-based protocol for securely transmitting EDI data over the internet. It uses encryption, digital signatures and receipts to confirm delivery and protect message integrity.
AS3 is a file-based protocol that sends EDI data over FTP or FTPS using a store-and-forward model. It allows partners to retrieve messages asynchronously instead of relying on real-time delivery.
AS4 is a web services-based standard for secure B2B data exchange using SOAP and ebMS 3.0. It supports reliable messaging, scalability and interoperability across enterprise systems.
An API is a set of rules that allows software systems to communicate and exchange data. It enables automation, integration and workflow execution across platforms.
Audit logging captures system events like file transfers, logins and configuration changes. It provides visibility for monitoring activity, investigating issues and meeting compliance requirements.
An audit trail is a time-ordered record of actions taken within a system or workflow. It shows who did what and when to support accountability and traceability.
Auto scaling automatically adjusts system resources based on workload demand. It helps maintain performance during spikes while reducing unused capacity during low activity.
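The core of auto scaling is a decision rule that maps current demand to a resource count. A minimal Python sketch, assuming a hypothetical queue-depth metric and per-instance capacity (both illustrative, not from any specific product):

```python
import math

def desired_instances(queue_depth: int, per_instance_capacity: int,
                      min_n: int = 1, max_n: int = 10) -> int:
    """Pick an instance count that covers the backlog, clamped to limits."""
    needed = math.ceil(queue_depth / per_instance_capacity) if queue_depth > 0 else min_n
    return max(min_n, min(max_n, needed))

print(desired_instances(0, 50))     # quiet period: stay at the minimum
print(desired_instances(230, 50))   # spike: scale out to 5 instances
print(desired_instances(5000, 50))  # extreme load: capped at max_n
```

Real autoscalers add cooldown periods and smoothing so brief spikes don't cause constant resizing, but the clamp-to-range logic is the same.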
Automated alerts are notifications triggered by predefined system events or conditions. They help teams respond quickly to failures, anomalies or completed processes.
Automated file transfer moves data based on schedules or event triggers without manual steps. It improves consistency and reduces errors in recurring workflows.
B2B file transfer is the secure exchange of data between organizations using standardized protocols. It supports business processes like supply chains, payments and reporting.
B2B integration connects systems across organizations to exchange data and automate processes. It improves coordination between partners while reducing manual handling.
Bandwidth throttling intentionally limits data transfer speed across a network. It helps control congestion and ensures other applications maintain stable performance.
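One common way to throttle is to split a payload into chunks and pace them so the average rate stays under a cap. A simplified sketch (the function name and pacing scheme are illustrative):

```python
import io

def throttled_chunks(data: bytes, rate_bps: int, chunk_size: int = 4096):
    """Yield (delay_seconds, chunk) pairs; sleeping for each delay before
    sending keeps the average rate at or below rate_bps."""
    stream = io.BytesIO(data)
    while chunk := stream.read(chunk_size):
        yield len(chunk) / rate_bps, chunk

plan = list(throttled_chunks(b"x" * 10000, rate_bps=8192))
# Three chunks: two full 4096-byte chunks, then the 1808-byte remainder.
```

Production throttlers typically use a token bucket instead, which allows short bursts while enforcing the same long-run average.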
Centralized management uses a single platform to control file transfers and related processes. It simplifies administration while improving visibility, consistency and policy enforcement.
A certificate spill occurs when a private key or digital certificate is exposed unintentionally. This compromises encrypted communications and can allow unauthorized access or impersonation.
CFTP is a certification that validates expertise in secure file transfer technologies and practices. It demonstrates knowledge of protocols, security controls and compliance requirements.
Checkpoint restart allows file transfers to resume from the last successful point after interruption. It avoids restarting from the beginning and saves time and bandwidth.
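The mechanism behind checkpoint restart is simple: record how many bytes have already arrived and seek past them on retry. A minimal local-file sketch, assuming the destination file's current size serves as the checkpoint:

```python
import os

def resume_copy(src_path: str, dst_path: str, chunk_size: int = 65536) -> int:
    """Append the remainder of src to dst, skipping bytes already written."""
    offset = os.path.getsize(dst_path) if os.path.exists(dst_path) else 0
    with open(src_path, "rb") as src, open(dst_path, "ab") as dst:
        src.seek(offset)                      # jump to the checkpoint
        while chunk := src.read(chunk_size):
            dst.write(chunk)
    return offset  # bytes skipped thanks to the checkpoint
```

Network protocols implement the same idea with a restart offset exchanged between client and server (for example, FTP's REST command).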
A clear text password is stored or transmitted in an unencrypted, readable format. This creates significant security risk because it can be easily intercepted or exposed.
Cloud MFT is a managed file transfer solution hosted in a cloud environment. It enables secure, scalable and automated data exchange without on-premises infrastructure.
Cloud native refers to applications designed specifically to run in cloud environments. These systems use scalable, distributed architectures for flexibility and rapid updates.
Compression reduces file size before transmission to improve transfer speed and efficiency. It also lowers storage requirements and optimizes network usage.
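A quick demonstration with Python's standard zlib module, showing that compression is lossless and shrinks repetitive data (the payload here is made-up log text):

```python
import zlib

payload = b"2024-06-03 INFO transfer complete\n" * 500
compressed = zlib.compress(payload, level=6)
restored = zlib.decompress(compressed)

assert restored == payload  # lossless round trip
print(f"{len(compressed)} of {len(payload)} bytes after compression")
```

Highly repetitive data like logs compresses dramatically; already-compressed formats such as JPEG or ZIP see little or no benefit.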
A connector links systems or applications to enable data exchange between them. It supports integration by allowing automated communication without manual intervention.
A control file contains instructions that define how other files should be processed or transferred. It ensures consistency by guiding workflows, validation and routing steps.
Cut-off time is the deadline for processing or accepting files within a defined window. Files received after this point are typically handled in the next cycle.
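The routing decision is a simple time comparison. A sketch assuming a hypothetical 5:00 PM cut-off (the deadline and cycle labels are illustrative):

```python
from datetime import datetime, time

CUTOFF = time(17, 0)  # assumed 5:00 PM processing deadline

def processing_cycle(received: datetime) -> str:
    """Files received after the cut-off roll into the next cycle."""
    return "same-day" if received.time() <= CUTOFF else "next-day"

print(processing_cycle(datetime(2024, 6, 3, 16, 59)))  # same-day
print(processing_cycle(datetime(2024, 6, 3, 17, 1)))   # next-day
```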
CRC is an error-detection method that generates a value based on file contents. It’s used to verify data integrity by identifying accidental changes during transmission.
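Python's zlib module exposes the widely used CRC-32 variant; the sender computes a value over the file contents and the receiver recomputes and compares:

```python
import zlib

def crc_of(data: bytes) -> int:
    # Mask to 32 bits for a stable unsigned value across platforms.
    return zlib.crc32(data) & 0xFFFFFFFF

sent = b"invoice batch 2024-06"
checksum = crc_of(sent)

assert crc_of(sent) == checksum                      # intact data verifies
assert crc_of(b"invoice batch 2024-07") != checksum  # corruption is flagged
```

CRCs catch accidental transmission errors but are not cryptographic: a deliberate tamperer can forge a matching CRC, which is why integrity-sensitive transfers use hashes like SHA-256 instead.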
A DEP is an entity authorized to send or receive files within a transfer system. It represents a trusted participant in structured data exchanges.
DLP is a security approach that monitors and blocks unauthorized access or sharing of sensitive data. It helps prevent leaks and supports compliance with data protection requirements.
Data portability is the ability to move data between systems in a usable format. It ensures information can be transferred without losing structure or meaning.
A DPIA is a process that evaluates how personal data is handled in a system or project. It identifies privacy risks and defines measures to address them.
Data replication copies information across multiple locations to maintain consistency and availability. It supports backup, redundancy and system performance.
Data transfer is the movement of digital information between systems or locations. It enables communication, integration and automated workflows across environments.
Data transformation converts information from one format or structure into another. It ensures compatibility between systems and supports processing or analysis.
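A typical transformation renames fields and coerces types while converting between formats, here CSV to JSON (the schema and field names are made up for illustration):

```python
import csv
import io
import json

RAW = "cust_id,amt\n42,19.99\n7,5.00\n"

def transform(row: dict) -> dict:
    # Rename fields to the target schema and coerce string values to numbers.
    return {"customerId": int(row["cust_id"]), "amount": float(row["amt"])}

records = [transform(r) for r in csv.DictReader(io.StringIO(RAW))]
print(json.dumps(records))
```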
Decompression restores compressed data back to its original size and format. It allows files to be opened and used after transfer or storage optimization.
Decryption converts encoded data back into readable form using a key. It allows authorized users to access protected information securely.
The defense-in-depth model uses multiple layers of security controls across systems and networks. Each layer adds protection to reduce the impact of potential threats.
A DMZ is a network segment that separates internal systems from external networks. It provides controlled access while reducing exposure to external threats.
Deployment is the process of installing and configuring software in a live environment. It makes applications available for use within existing systems and workflows.
Deprovisioning removes user access, credentials or system permissions when no longer needed. It helps prevent unauthorized access and supports compliance.
DES is a legacy symmetric encryption algorithm that uses a short key to secure data. It’s no longer considered safe and has been replaced by stronger methods like AES.
Digital transformation is the adoption of technology to improve business processes and operations. It drives efficiency, innovation and better user experiences.
Disaster recovery is a set of processes for restoring systems and data after disruption. It helps organizations resume operations with minimal downtime and loss.
A demilitarized zone (DMZ) gateway is a secure intermediary that routes external traffic to internal systems without direct exposure. It protects internal resources by controlling and isolating incoming connections.
A DMZ proxy relays requests between external users and internal servers through a protected network layer. It hides internal systems and reduces the risk of direct access.
Downtime is the period when a system or service is unavailable or not functioning. It can result from maintenance, failures or external disruptions.
Drummond certified indicates that a product has passed interoperability and security testing for specific protocols. It confirms compatibility with other certified systems in real-world use.
Elasticity is the ability of a system to adjust resources automatically based on demand. It scales up during high usage and scales down when demand decreases.
EDI is the structured exchange of business documents between systems using standardized formats. It replaces manual processes like paper-based transactions.
ESD delivers software and updates over a network instead of physical media. It enables faster distribution and easier deployment to multiple users.
Encryption converts data into an unreadable format to protect it from unauthorized access. It secures information during storage and transmission.
End-to-end encryption protects data from sender to recipient without intermediate access. Only the intended parties can decrypt and read the content.
An endpoint is a system or location where data enters or exits during a transfer. It can include servers, applications or external partner systems.
EAI connects internal applications so they can share data and coordinate processes. It improves efficiency by enabling systems to work together without manual effort.
EiPaaS is a cloud-based platform that connects enterprise applications and data sources. It supports scalable integration across hybrid and distributed environments.
ERP integration connects enterprise resource planning systems with other applications or data sources. It enables consistent data flow and improves operational visibility.
ETL automation handles extracting, transforming and loading data without manual input. It prepares information for analysis, reporting or system use.
Event-driven file transfer initiates data movement when specific conditions occur. It reacts to triggers like file creation or system updates instead of fixed schedules.
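The simplest trigger mechanism is a polling watcher that compares directory contents between passes. A sketch (the function name is illustrative; production tools usually use OS-level file-system notifications instead of polling):

```python
import os

def detect_new_files(directory: str, seen: set) -> list:
    """One polling pass: return files that appeared since the last check,
    so each can be handed to the transfer workflow as a trigger."""
    current = set(os.listdir(directory))
    new = sorted(current - seen)
    seen.update(current)   # remember everything observed so far
    return new
```

Each returned name would fire the downstream workflow; files already seen produce no trigger on later passes.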
External authentication verifies user identity through third-party systems like SAML or LDAP. It removes the need to store credentials within the application.
External file transfer moves data between internal systems and outside entities. It involves partners, vendors or customers beyond the organization’s network.
Extreme file transfer handles very large datasets or high-speed data movement across networks. It’s designed for performance in demanding or long-distance scenarios.
The FDIC is a US government agency that protects deposits at insured financial institutions. It also supervises banks to maintain trust in the financial system.
File integrity checking verifies that data hasn’t been altered or corrupted. It uses hashes or checksums to confirm consistency during transfer or storage.
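A minimal integrity check with a SHA-256 hash: the sender publishes the digest, and the receiver recomputes it over the bytes that arrived (file contents here are placeholder text):

```python
import hashlib

def sha256_of(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

original = b"contents of quarterly-report.pdf"
digest = sha256_of(original)

assert sha256_of(original) == digest           # unchanged file verifies
assert sha256_of(original + b"x") != digest    # any alteration is detected
```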
File name filters use patterns or rules to include or exclude specific files during transfers. They help ensure only the correct files are selected for processing.
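Filters are usually expressed as glob patterns. A sketch using Python's fnmatch, with an assumed include/exclude rule pair (the patterns are illustrative):

```python
import fnmatch

def select_files(names, include="*.csv", exclude="*_tmp*"):
    """Keep files matching the include pattern unless the exclude rule hits."""
    return [n for n in names
            if fnmatch.fnmatch(n, include) and not fnmatch.fnmatch(n, exclude)]

batch = ["orders.csv", "orders_tmp.csv", "readme.txt", "refunds.csv"]
print(select_files(batch))  # ['orders.csv', 'refunds.csv']
```

Exclude rules like this commonly guard against picking up partially written files that upstream systems name with a temporary suffix.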
File renaming automatically changes file names based on defined rules during workflows. It helps enforce naming standards and maintain consistency across systems.
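A renaming rule is just a deterministic function of the original name plus workflow metadata. A sketch assuming a hypothetical PARTNER_YYYYMMDD_name convention:

```python
from datetime import date

def apply_naming_rule(name: str, received: date, partner: str) -> str:
    """Prefix incoming files with partner and date (assumed convention)."""
    return f"{partner.upper()}_{received.strftime('%Y%m%d')}_{name}"

print(apply_naming_rule("invoices.csv", date(2024, 6, 3), "acme"))
# ACME_20240603_invoices.csv
```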
File sharing is the process of providing access to digital files between users or systems. It supports collaboration and controlled data distribution.
File transfer automation uses predefined rules to move files without manual input. It increases reliability and ensures consistent execution of transfer tasks.
FTP is a standard network protocol used to transfer files between systems over a network. It doesn’t use encryption, which makes it less secure than modern alternatives.
File transfer security certification verifies that a system meets established security standards. It demonstrates the ability to protect data during transmission.
A file transfer workflow is a structured sequence of steps that governs how files are processed and moved. It automates tasks like routing, validation and error handling.
FIPS 140-2 is a US government standard that defines security requirements for cryptographic modules. It’s being phased out in favor of the updated FIPS 140-3.
FIPS 140-2 compliance means a system follows the security requirements defined by the FIPS 140-2 standard. It ensures approved methods are used for protecting sensitive data.
FIPS 140-3 is the current US standard for validating cryptographic modules. It introduces updated requirements aligned with international security frameworks.
FIPS compliant describes systems that meet US government standards for data protection and cryptography. It indicates the use of approved methods for securing sensitive information.
FIPS validated describes cryptographic modules that have been officially tested and approved by the CMVP. It confirms they meet required security standards for encryption.
A firewall monitors and controls network traffic based on defined security rules. It blocks unauthorized access while allowing trusted communication.
A flow-based system processes data through a sequence of connected steps or tasks. It enables structured automation and efficient execution of workflows.
A folder-based system organizes files within a hierarchical directory structure. It helps users manage and locate data using grouped storage.
FTPS is a secure version of FTP that uses SSL or TLS to encrypt data in transit. It protects files and credentials during transmission across networks.
FTP with PGP combines file transfer protocol with encryption from Pretty Good Privacy. It protects file contents even when using an otherwise unencrypted method.
GDPR is a European Union law that governs how personal data is collected and processed. It gives individuals rights over their data and enforces strict privacy rules.
GnuPG is an open-source tool that implements the OpenPGP standard for encryption and signing. It secures files and communications using key-based cryptography.
GLBA is a US law that requires financial institutions to protect customer data. It also mandates transparency in how personal information is shared.
Granular permissions allow precise control over access to specific files, actions or system functions. They enable organizations to assign detailed rights based on user needs.
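Conceptually this is a lookup from (role, resource) to a set of allowed actions. A toy policy table and check (the roles, paths and actions are invented for illustration, not any product's model):

```python
# Role -> path -> allowed actions (illustrative policy only)
PERMISSIONS = {
    "analyst":  {"/reports": {"read"}},
    "operator": {"/reports": {"read"}, "/inbound": {"read", "write", "delete"}},
}

def allowed(role: str, path: str, action: str) -> bool:
    """Deny by default: only explicitly granted actions pass."""
    return action in PERMISSIONS.get(role, {}).get(path, set())

assert allowed("operator", "/inbound", "delete")
assert not allowed("analyst", "/reports", "write")
```

The deny-by-default lookup is the key property: anything not explicitly granted is refused.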
Guaranteed delivery ensures that files reach their destination without loss or corruption. It uses confirmations and retry mechanisms to verify successful transfer.
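The confirm-and-retry loop can be sketched as: hash the payload, send, compare the receiver's hash, and resend on mismatch. A simplified model where the channel is passed in as a function (names and the retry limit are illustrative):

```python
import hashlib

def deliver(payload: bytes, send, max_retries: int = 3) -> bool:
    """Retry sending until the bytes that arrived hash to the expected value."""
    expected = hashlib.sha256(payload).hexdigest()
    for _ in range(max_retries):
        received = send(payload)  # the channel returns what actually arrived
        if hashlib.sha256(received).hexdigest() == expected:
            return True           # confirmed: no loss or corruption
    return False                  # give up and surface the failure
```

Real systems pair this with persistent queues so a retry survives a process or network restart, but the verify-then-acknowledge cycle is the same.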
Headless file transfer operates without a graphical interface or user interaction. It runs through scripts, APIs or command-line tools for automated execution.
High availability refers to systems designed to remain operational with minimal interruption. It relies on redundancy and failover to maintain continuous service.
HIPAA compliance involves meeting US regulations for protecting healthcare data. It requires safeguards to secure sensitive patient information during use and transfer.
HTTP file transfer uses the hypertext transfer protocol to move files over the web. It enables simple uploads and downloads through browsers or web-based tools.
HTTPS file transfer secures file movement using encrypted HTTP connections. It protects data in transit through SSL or TLS encryption.
Hybrid architecture combines on-premises systems with cloud environments in a single setup. It provides flexibility for managing workloads across different infrastructures.
Hybrid MFT uses both cloud-based and on-premises systems for file transfers. It allows organizations to balance control, scalability and compliance needs.
An inbound connection is a request from an external source to access an internal system. It’s commonly used for receiving files from partners or external users.
Integration connects systems or applications so they can exchange data and function together. It enables automated workflows and improves coordination across environments.