Crystal Intelligence and its partners, BCB Group, Bitstamp, and CMS, held this webinar to inform government policymakers, regulators, compliance and law enforcement officers, investigators, and exchanges about the intersection of artificial intelligence (AI) with the crypto industry.
This included discussing the regulatory, technical, practical, ethical, and even environmental issues related to how the crypto industry deploys AI and the risks and opportunities for industry role-players.
AI in crypto regulation and compliance: the panel
The panel was moderated by BCB Group’s Head of Compliance, Kym Routledge, and consisted of:
- Kamran Choudhary, Head of Compliance at BCB Group.
- Charlie Kerrigan, Lawyer specializing in finance and technology and partner at CMS.
- Charlotte Baker, Head of UK Compliance & Regional Policy at Bitstamp.
- Nicholas Smart, VP of Intelligence at Crystal.
How AI is currently being leveraged in the RegTech field
Nick Smart explained how Crystal Intelligence currently uses AI to process and analyze the massive transaction volumes of blockchains. This is done firstly for detecting patterns such as address clustering, which it can do more effectively, more efficiently, and on a far larger scale than humans could.
It is also used to detect anomalies that could present risks to clients, such as so-called wash trading, market manipulation, and other unusual transactional activity indicative of potential crime.
Ultimately, using AI to process massive transaction volumes or detect anomalies enables human intelligence to come into play. It is useful to note that replicating AI’s functionality with human labor would come at a high cost in time, expenses, and resources.
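To illustrate the kind of anomaly detection described above, here is a minimal Python sketch that flags unusual transfers with an Isolation Forest. The column names, feature choices, and thresholds are hypothetical assumptions for illustration, not a description of Crystal’s production pipeline.

```python
# Minimal sketch: flagging anomalous transactions with an Isolation Forest.
# Column names, features, and thresholds are illustrative assumptions,
# not Crystal Intelligence's actual pipeline.
import pandas as pd
from sklearn.ensemble import IsolationForest

# Hypothetical on-chain transfers per sender: value, fee, seconds since previous tx.
transactions = pd.DataFrame({
    "sender":          ["0xA1", "0xA1", "0xB2", "0xB2", "0xC3", "0xC3"],
    "value_eth":       [0.5,    0.7,    120.0,  0.4,    0.6,    95.0],
    "fee_eth":         [0.001,  0.001,  0.02,   0.001,  0.001,  0.015],
    "secs_since_prev": [3600,   4200,   12,     3900,   3500,   8],
})

features = transactions[["value_eth", "fee_eth", "secs_since_prev"]]

# Unsupervised outlier detector; contamination is the assumed share of anomalies.
model = IsolationForest(contamination=0.3, random_state=42)
transactions["anomaly"] = model.fit_predict(features)  # -1 = outlier, 1 = normal

# Outliers are surfaced for review rather than auto-actioned.
print(transactions[transactions["anomaly"] == -1])
```

In practice, the flagged rows would be routed to an analyst for review, mirroring the division of labor described above: AI surfaces the patterns, humans make the decisions.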
The challenges of integrating AI into crypto firms’ financial control frameworks
1. All financial firms should pinpoint where change is needed to address challenges such as processing large data volumes or issues with fraud detection. Crypto firms should then perform rigorous risk assessments of the areas of their work they are seeking to change with AI, which helps speed up implementation.
2. From a regulatory point of view, AI can be used to guide businesses on what changes they should make. It is important to note that AI is deployed to support and enhance existing compliance frameworks, not replace human decision-making.
The latter would raise concerns about what would happen if AI made errors that could in turn adversely impact businesses and customers. The panel gave a hypothetical example in which an AI system might not understand the concept of fairness and therefore unduly target certain categories of accounts or account holders.
Ironically, such errors could create more work for humans, as they then need to correct the AI’s mistakes.
The operational challenges of implementing AI-powered compliance tools within crypto firms
AI can be highly influential in the crypto space, which deals in massive tranches of data. That data comes in two kinds, off-chain and on-chain:
- Off-chain data consists of elements such as KYC records, customer risk profiles, transaction monitoring records, and legal information, all of which create a holistic view of each customer.
- On-chain data relates to information transmitted and recorded on blockchains. Different blockchain networks operate differently (e.g., Bitcoin transactions versus Ethereum transactions), which means analyzing them requires a smorgasbord of specialist tools, such as those Crystal offers.
Compliance then requires fusing both data sets, a complex analytical interplay. To perform this accurately at such a scale, crypto firms must thoroughly understand their data and operational requirements before seeking AI assistance.
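As a concrete, hedged illustration of that fusion, the Python sketch below joins hypothetical off-chain KYC risk profiles with aggregated on-chain exposure figures on a shared wallet identifier. All field names, addresses, and the escalation rule are invented for the example.

```python
# Minimal sketch: fusing off-chain KYC data with on-chain activity.
# All field names, wallet addresses, and the scoring rule are hypothetical.
import pandas as pd

# Off-chain: KYC records and customer risk profiles held by the firm.
kyc = pd.DataFrame({
    "customer_id": ["C001", "C002", "C003"],
    "wallet":      ["0xA1", "0xB2", "0xC3"],
    "kyc_risk":    ["low", "medium", "high"],
})

# On-chain: aggregated exposure per wallet, e.g. from a blockchain analytics tool.
onchain = pd.DataFrame({
    "wallet":             ["0xA1", "0xB2", "0xC3"],
    "mixer_exposure_pct": [0.0, 12.5, 41.0],
    "tx_volume_30d_usd":  [4_200, 87_000, 310_000],
})

# Fuse the two views on the wallet identifier to get one holistic customer record.
combined = kyc.merge(onchain, on="wallet", how="left")

# Toy escalation rule: high KYC risk or heavy mixer exposure goes to a human reviewer.
combined["escalate"] = (combined["kyc_risk"] == "high") | (combined["mixer_exposure_pct"] > 25)
print(combined[["customer_id", "kyc_risk", "mixer_exposure_pct", "escalate"]])
```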
The crypto industry is dynamic and rapidly growing, which adds another challenge: keeping AI models up to date. To extract the best results from AI in the ever-changing crypto landscape, these models must be trained frequently by competent humans who understand how to formulate the right questions. In turn, the questions humans ask AI tools must keep improving if the quality of the answers is to improve.
The challenges of aligning AI-powered compliance tools with legislation in crypto
The relative nascence of the crypto industry arguably gives it an advantage in following AI legislation over traditional and often culturally conservative financial institutions. The crypto sector already incorporates AI into its framework, making it more agile and more adaptable. Nonetheless, differing regulatory jurisdictions also remain a challenge for both sectors, though for different reasons.
AI legislation must conform to two general data regulation principles:
- There are data privacy rules for the broader financial sector, including the General Data Protection Regulation (GDPR), which govern both personal and professional data.
- Then there are AI-specific regulations, for example, those set out in the EU Artificial Intelligence Act (EU AI Act).
The EU AI Act has three expectations of firms incorporating AI tools into their work:
- Identifying AI systems within the organization.
- Performing risk classifications of those systems.
- Keeping records of AI systems’ usage which explain and justify decision-making processes.
A general legal recommendation for firms adopting AI systems is to define and implement policies that govern the impacted operations.
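One simple way to operationalize those three expectations and the accompanying policy recommendation is an internal register of AI systems, each with a risk classification and an auditable usage log. The Python sketch below shows one hypothetical shape such a register could take; it is not a format prescribed by the EU AI Act.

```python
# Minimal sketch of an internal AI-system register covering the three expectations
# above: identify systems, classify their risk, and record usage with a justification.
# The fields and risk tiers are illustrative, not a mandated format.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AISystem:
    name: str
    purpose: str
    risk_class: str                 # e.g. "minimal", "limited", "high"
    usage_log: list = field(default_factory=list)

    def record_decision(self, decision: str, justification: str) -> None:
        """Keep an auditable record of what the system decided and why."""
        self.usage_log.append({
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "decision": decision,
            "justification": justification,
        })

# 1. Identify the AI system within the organization.
tm_model = AISystem(
    name="transaction-monitoring-model",
    purpose="Score transactions for potential money-laundering risk",
    risk_class="high",              # 2. Risk classification of the system.
)

# 3. Record usage in a way that explains and justifies the decision.
tm_model.record_decision(
    decision="escalate_to_analyst",
    justification="Score 0.92: rapid transfers through a newly created counterparty",
)
print(tm_model.usage_log)
```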
The challenges of aligning AI-powered compliance tools with regulation in crypto
Regulators understandably adopt a cautious approach to AI tools, believing that overdependence on AI solutions runs the operational risk of having no contingency plans in place if an entirely automated system collapses. One way to remedy this would be to deploy diagnostic tools that decode how an AI tool reaches its conclusions before those conclusions are escalated to skilled human resources who can act on the findings.
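As a hedged illustration of such a diagnostic step, the sketch below uses permutation importance from scikit-learn to show which input features drive a toy risk model’s alerts before those alerts are handed to a human analyst. The model, features, and synthetic data are placeholders rather than any specific regulator-endorsed tool.

```python
# Minimal sketch: a diagnostic step that explains which features drive a risk model's
# output before an alert is escalated to a human analyst. Model, features, and data
# are hypothetical placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
feature_names = ["tx_value_usd", "mixer_exposure_pct", "account_age_days"]

# Synthetic training data: the label is 1 when mixer exposure is high.
X = rng.random((500, 3)) * [100_000, 100, 2_000]
y = (X[:, 1] > 60).astype(int)

model = RandomForestClassifier(random_state=0).fit(X, y)

# Permutation importance approximates how much each feature contributes to predictions,
# giving the analyst a human-readable reason alongside the raw alert.
result = permutation_importance(model, X, y, n_repeats=10, random_state=0)
for name, score in sorted(zip(feature_names, result.importances_mean), key=lambda p: -p[1]):
    print(f"{name}: {score:.3f}")
```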
Global standardization of AI usage protocols could also resolve concerns about crypto exchanges migrating their business to more accommodating jurisdictions, which would stifle the industry’s development in, for example, the EU, where MiCA is now in full force.
Meanwhile, regulators have taken a wait-and-see rather than a hardline approach to how crypto firms apply AI. It was also noted that regulators use AI for certain purposes that are not necessarily crypto-related. The absence of universal, harmonized benchmark measurements should not stop crypto firms from evaluating the possible regulatory risks of their planned AI implementations.
How AI in the crypto sector can influence ESG
Opinion is divided on the impact of AI on environmental, social, and governance (ESG) criteria, which measure companies’ ethical and sustainability profiles, both within and beyond the crypto industry.
One example of AI’s negative influence is its adverse environmental impact: the International Energy Agency’s 2024 report estimated that AI data centers use more energy than crypto mining. They are also understood to be using increasing amounts of water to cool their hard-working machines. On the other hand, AI research could be deployed to optimize energy usage as demand increases but remains disproportionately distributed at a local level, possibly enhancing projects like those the US Department of Energy is currently pursuing.
Crystal noted a similar contradiction in ESG compliance involving its deployment of AI tools. While it uses an AI data center, potentially contributing to energy waste, it also embraces remote-first work, which could offset energy consumed by commuting and driving. Meanwhile, Crystal’s core work, combating financial crime, represents a social benefit, since it counters money laundering and associated activities that ultimately inflict a multitude of harms on society.
Is AI a risk to the livelihood of crypto compliance professionals?
The consensus of the panel was that it is not. AI is a force multiplier deployed in three consecutive stages:
- Driving efficiency: AI enhances the ability to process massive amounts of data quickly.
- Human decision-making: A seasoned professional is essential to make appropriate decisions about the AI outputs.
- Human-led governance: This ensures transparency and accountability in any investigation.
It was also pointed out that, from a regulatory perspective, current frameworks ultimately require a human to be held accountable in any circumstance, whether facing legal consequences or paying fines. The issue of job displacement by AI, as with previous revolutionary technologies, was nonetheless acknowledged.
Is AI key to scaling a business while ensuring compliance?
Essentially, it is critical to make an accurate assessment of whether AI tools should be deployed at all, based on the scale and complexity of the problem, before using them. However, it was noted that once deployed, AI tools can massively enhance productivity with fewer human resources, although not at the expense of human oversight.
Key takeaways about AI in crypto regulation and compliance
- The advantages of implementing AI solutions in crypto applications far outweigh the risks.
- The deployment of AI tools in the crypto industry enhances the work of human resources; it does not replace it.
- Clarity on why and how AI tools are to be deployed is critical for all industry role-players before they decide to do so.