The history of the threat landscape has, for the past 20 years or more, been a story of attackers versus defenders: an innovation arms race with no long-term victor. The question now facing researchers looking to head off the worst excesses of the cybercrime underground is whether AI and machine learning will finally give black hats the edge. While most security vendors today are tapping these emerging technologies to better detect malware and spot malicious behaviour, there is a wealth of new opportunities on the other side to help the bad guys circumvent traditional defences and craft more cunning attacks.

Spear-phishing at scale
Targeted attacks have, until now, by their very nature been limited to a small number of users in an organisation. It takes time for an attacker to conduct reconnaissance and understand the target: how they work and what they're likely to click on. AI and machine learning, in combination with other tools, offer the possibility of doing that on a massive scale.

Tools could be developed to ingest LinkedIn and other third-party data en masse, run predictive analytics on it and then automate widespread spear-phishing campaigns, with each target receiving a different email depending on their profile. Trend Micro currently uses AI in its Writing Style DNA feature to understand how users compose emails in order to prevent spoofing attacks such as business email compromise (BEC). But the same kind of AI technology could theoretically be used by cyber-criminals to mimic specific users more convincingly and trick targets into clicking.
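To give a concrete sense of the defensive idea behind writing-style analysis, the sketch below compares an incoming message against a baseline of a sender's known emails. It is purely illustrative and is not Trend Micro's actual Writing Style DNA implementation: the function names, sample emails and the use of character-trigram profiles with cosine similarity are all assumptions standing in for a far richer production model.

# Illustrative sketch of writing-style comparison for spoofing detection.
# Not any vendor's real product: a toy stylometric baseline built from
# character trigram frequencies and cosine similarity.
from collections import Counter
from math import sqrt

def trigram_profile(text: str) -> Counter:
    """Count character trigrams as a crude writing-style fingerprint."""
    text = text.lower()
    return Counter(text[i:i + 3] for i in range(len(text) - 2))

def cosine_similarity(a: Counter, b: Counter) -> float:
    """Cosine similarity between two trigram frequency profiles."""
    dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
    norm_a = sqrt(sum(v * v for v in a.values()))
    norm_b = sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# Known-genuine emails from the executive establish a baseline profile.
baseline = trigram_profile(
    "Hi team, please find the quarterly figures attached. Thanks, Sarah"
)
# An incoming message claiming to be from the same sender is scored
# against that baseline; a low score flags possible impersonation.
incoming = trigram_profile(
    "URGENT!!! Wire the funds to the account below immediately."
)
print(f"Style similarity: {cosine_similarity(baseline, incoming):.2f}")

Even this toy comparison shows the principle the article is driving at: the same style fingerprinting that lets a defender flag an impostor could, in an attacker's hands, be turned around to tune a forgery until it reads as genuine.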

Hackers could even use web browsing and other data to profile and predict individuals' online behaviour, in order to insert malware or phishing emails at the perfect opportunity. If they know User A visits specific e-commerce sites at lunchtime, for example, they might drop a phishing email into their inbox at 11.50am offering a discount at that retailer if they click through.

When attack campaigns become this hard to spot, they are a nightmare to secure against, both from a technological perspective and in terms of training end users.

Under the radar
This ability to monitor behaviour and spot patterns humans can't will also give hackers an advantage in carrying out data theft while staying hidden from the latest security technologies. At present, vendor tools monitor for various types of unusual activity that might indicate a threat. But what if the bad guys are able to hide their activity so that it doesn't appear unusual at all? Then it's like trying to find the proverbial needle in the haystack.

Monitoring enterprise data flows could reveal the perfect moment to exfiltrate a trove of stolen data, for example. Or analysing and predicting when Windows updates occur could let attackers exploit system downtime to install malware or move laterally inside networks. If modern security is all about looking for anomalies, what happens when there are none? The attackers are effectively hiding in plain sight.
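To illustrate what "looking for anomalies" means in practice, here is a deliberately simplified sketch of a defender-side check that flags outbound transfer volumes sitting far outside a host's historical baseline. It is not any vendor's actual detection logic; the function, threshold and traffic figures are assumptions chosen for illustration.

# Simplified anomaly check: flag outbound transfer volumes that deviate
# sharply from a host's historical daily baseline (illustrative only).
from statistics import mean, stdev

def is_anomalous(history_mb, current_mb, threshold=3.0):
    """Flag a transfer more than `threshold` standard deviations
    above the host's historical daily average."""
    mu = mean(history_mb)
    sigma = stdev(history_mb)
    if sigma == 0:
        return current_mb > mu
    return (current_mb - mu) / sigma > threshold

# Thirty days of routine outbound traffic for one workstation (MB/day).
history = [120, 135, 110, 128, 142, 119, 131] * 4 + [125, 130]
print(is_anomalous(history, 2_400))  # a bulk exfiltration spike -> True
print(is_anomalous(history, 138))    # traffic that blends in    -> False

The article's concern is precisely the second case: activity paced to stay inside the normal baseline never trips a threshold like this, which is what makes it so hard to catch.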

Fortunately, the technology is not quite there yet. Yes, there are cloud-based services that cybercrime groups can rent for scalable storage and compute power at relatively modest prices. There are even predictive analytics capabilities, but attackers may struggle to obtain the large volumes of data required to build these models from inside a target organisation. Of course, when it comes to external sources there's no such problem.

Joining the dots
However, once the black hats manage to join the dots and close these technology gaps, there could be trouble ahead. Malicious AI tools will inevitably make their way onto the "as-a-service" cybercrime underground, where they'll be democratised to the masses just as ransomware, exploit kits and banking trojans were before them.

AI could even be used in the future to help spoof video and audio of business leaders, politicians and others. Imagine an incoming BEC-style attack, but this time, instead of an email, it's a FaceTime call in which a convincing fake of the CEO directs finance teams 'in person' to transfer money out of the company. Or consider highly damaging spoofed footage of a presidential candidate emerging hours before an election.

The cybersecurity arms race has been a constant for many years, but with the emergence of AI and machine learning the stakes are higher than ever. For the vendor community, the only way to stay on top is to cast the net as wide as possible with forward-looking threat research and never give up.


Copyright © 2019 Khaleej Times. All Rights Reserved. Provided by SyndiGate Media Inc. (Syndigate.info).