Data is an organization’s most valuable asset. Whether compiled over time or acquired from third parties, it underpins every business activity and every decision. Its loss or corruption is, at best, costly to rectify; at worst, a catastrophic event can threaten the enterprise’s ongoing viability.
Research by IBM revealed that, in 2021, the average cost of a data breach originating with compromised credentials was $4.37m. The cost of breaches in which remote working was a factor was, on average, $1.07m higher.
Yet, data can’t be locked away. When it’s inaccessible, it’s worthless. So, while it’s habitually stored in a single location – a data center – it may be processed using desktop or mobile apps, web apps or IoT devices.
Securing data center infrastructure
Keeping track of data as it moves between platforms can be difficult, and so can diagnosing problems: unless organizations maintain custody of every part of that process, they must entrust their digital assets to unseen and often unknown developers and service providers.
Can they really be sure their contractor isn’t using components with known vulnerabilities, particularly if they’re working to a budget? OWASP has been drawing attention to this attack vector for years and, as of August 2021, it still sits at number five in its Top 10 list of the most commonly encountered exploits.
Mobile applications and vulnerabilities
Data can be lost or corrupted by its owners’ own actions, but it is equally – and frequently – the target of malicious actors. Despite the code checking performed by leading app stores and cloud service providers, vulnerabilities are common.
High-risk vulnerabilities have been found in 38% of iOS and 43% of Android applications, with insecure data storage the most common issue (present in 76% of mobile applications). Beyond this, data can be put at risk through the use of insecure wireless networks, the installation of unsanctioned software on employer-provided devices, and the loss or theft of the devices themselves.
Common data breach threats
The threats that organizations face, and must mitigate, are many and varied. They include, but are not limited to:
- SQL injection, which can expose data and in some cases leave systems vulnerable to unauthorized remote control. In March 2021, Wordfence found that over 600,000 sites running a popular WordPress plugin were vulnerable to potential extraction of sensitive database information as a result of a Time-Based Blind SQL Injection.
- Cross-site scripting, which can give malicious actors superuser-level access to sensitive systems, potentially resulting in an ability to maintain long-term access to data, and to install their own malicious code. Cross-site scripting was the cause of 10% of blocked traffic across Verizon Media’s content delivery network in Q4 2020.
- Buffer overflow attacks, which write data beyond the boundary of an allocated buffer, corrupting adjacent memory and potentially crashing the system or opening the door to arbitrary code execution. Dover Microsystems calls them “one of the most common and severe types of software vulnerabilities, impacting every system, regardless of industry or application”, and 59 new examples were found in just the first two months of 2021.
- Security misconfiguration, as organizations frequently have incomplete control over the full data chain. While they can verify the configuration of the systems under their own control, they must rely on third-party providers and networks elsewhere within the process. “In my experience,” says Gergely Kalman, of Toptal, “web servers and applications that have been misconfigured are way more common than those that have been configured properly.”
- XML External Entity (XXE) injection, by which malicious actors can rewrite XML requests and thus retrieve data from the server to which they would not normally have access. Although this accounted for just 4.7% of the issues logged in EdgeScan’s 2021 vulnerability statistics report, it was nonetheless classified high risk, potentially giving attackers access to application server filesystems and back-end or external systems that the application can access.
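Of these threats, SQL injection is among the most straightforward to mitigate in application code: user input should never be interpolated directly into a query string; it should be passed as a bound parameter instead. A minimal sketch using Python’s standard `sqlite3` module (the table and the injection payload are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES ('alice')")

# Attacker-controlled input attempting a classic injection
user_input = "alice' OR '1'='1"

# UNSAFE: string interpolation lets the input rewrite the query,
# so the OR clause matches every row in the table
unsafe = conn.execute(
    f"SELECT name FROM users WHERE name = '{user_input}'"
).fetchall()

# SAFE: a parameterized query treats the input as a literal value,
# which matches no user
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)
).fetchall()

print(unsafe)  # [('alice',)] – the injection succeeded
print(safe)    # [] – the payload was treated as plain data
```

The same principle applies whatever the database driver: the query text and the data travel separately, so the data can never change the query’s structure.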
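Cross-site scripting is likewise largely a matter of discipline in the application: any user-supplied value must be escaped before it is rendered into HTML. A minimal illustration with Python’s standard `html` module (most server-side template engines apply this escaping automatically; the payload below is illustrative):

```python
import html

# Attacker-supplied value, e.g. a comment or profile field
payload = '<script>alert("xss")</script>'

# UNSAFE: interpolating raw input into markup lets the browser
# execute the attacker's script
unsafe_page = f"<p>Latest comment: {payload}</p>"

# SAFE: escaping turns markup characters into inert entities
safe_page = f"<p>Latest comment: {html.escape(payload)}</p>"

print(safe_page)
# <p>Latest comment: &lt;script&gt;alert(&quot;xss&quot;)&lt;/script&gt;</p>
```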
Mitigating the risk of data breach
Securing an organization’s virtual assets requires a multi-layered approach, with adequate safeguards at every level. Infrastructure, database, transport-layer, API and application security must all be considered. So, too, must staff’s understanding of the risks they face and of what behavior is – or isn’t – appropriate.
Where data management and handling is outsourced, as when using cloud services, a degree of security will necessarily be outsourced, too. In this case, data owners have a responsibility to their own organization – and their data subjects – to maintain a close watch on the policies of their providers and ensure they are fit for purpose.
Thorough and ongoing penetration testing is essential, but far from the only requirement. Reliance on a single security or testing suite is inadequate. A thorough audit of all parties with access to each part of a system is necessary, and each should operate with the lowest practical level of authority – the principle of least privilege.
Credentials should be set to expire at appropriate intervals (although Microsoft points out that “periodic password expiration is an ancient and obsolete mitigation of very low value”, so don’t make this the cornerstone of your security policy). Behind the scenes, monitoring and logging should be rigorous and granular so that anomalies can be quickly identified and remedied.
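Granular monitoring might, for instance, record every authentication attempt with enough context to make anomalies – such as repeated failures from one address – stand out. A sketch using Python’s standard `logging` module; the event names and field layout are illustrative assumptions, not a prescribed schema:

```python
import logging

# Structured, granular log lines: who, what and where, at a
# severity that reflects the outcome of each attempt
logging.basicConfig(format="%(asctime)s %(levelname)s %(message)s",
                    level=logging.INFO)
log = logging.getLogger("auth")

def record_login(user: str, source_ip: str, success: bool) -> str:
    """Log an authentication attempt and return the emitted message."""
    event = "login_ok" if success else "login_failed"
    message = f"{event} user={user} ip={source_ip}"
    (log.info if success else log.warning)(message)
    return message

record_login("alice", "203.0.113.7", True)
record_login("alice", "203.0.113.7", False)
```

Because each record carries a consistent event name and key=value fields, downstream tooling can aggregate failures by user or address and flag unusual patterns quickly.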
Ultimately, a balance must be struck between making applications – server, desktop, web or mobile – transparent, so they don’t interfere with users’ ability to achieve their goals, and keeping the organization’s data secure.
Related Case Studies
- Automotive Data Aggregation Using Cutting Edge Tech Tools: an award-winning automotive client whose product allows the valuation of vehicles anywhere in the world, tracking millions of price points and specification details across a large range of vehicles.
- End To End Automated Construction Data Harvesting And Aggregation: a leading construction intelligence service provider required the continuous tracking and update of data on construction projects through automation.