Expertise

Human Factors in Cyber Defence

In our connected world, our information technologies are vulnerable to infiltration, theft and sabotage by hostile states, organised criminals, ‘hacktivists’ and pranksters. Humans are ultimately the initiators, and the victims, of cyber attacks. We support the human side of cyber vulnerability investigations and threat intelligence, develop novel techniques that inform and strengthen active cyber defences, and innovate for stronger organisational security cultures, behaviours and awareness.

Caspar

We are helping develop cyber defence systems that ‘hack the hacker’ as they enter and engage. Here we are conducting ground-truth experiments on biases in hackers’ decision-making processes; the findings will then be developed into a series of AI-driven sensors and triggers. For more information see IARPA – ReSCIND.

Pilots

Guided by a series of identified deep cyber security problems, we explored how behavioural experimentation could be used to drive positive impact – and then developed a battery of methods for indirect behaviour change in live systems where direct interventions risked catastrophic unintended consequences.

Influence & Information Conflict

Interconnectivity enables propagandists, spammers and peddlers of disinformation to reach us at scale, changing our attitudes and behaviours in ways that are not in our individual or collective interest, and which undermine liberal democracy. The targets are the ways we think, feel and act – and the design of our social networks, our norms and group identities, our shared institutions and power structures. We investigate the nature of these threats and how they work, and help empower our societies to resist them.

Audiences

Following extensive collection and review of evidence across key domains that routinely practise systematic social influence – from marketing to epidemiology – we constructed an evidence library and a comprehensive user’s ‘how to’ guide to understanding audiences across any culture.

Understand

We were commissioned to construct a method for profiling the influence capabilities of a given set of entities – and to demonstrate this method in action by producing case studies based on open-source material. The method included use of the DISARM framework for categorising influence and information operations, which we evaluated as best in class.
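
To make the categorisation step concrete, the sketch below shows how a single case study might be tagged with DISARM-style technique labels and rolled up into a simple capability profile. The entity, summary and technique codes are hypothetical placeholders, not real DISARM IDs or client material.

```python
from dataclasses import dataclass, field

@dataclass
class InfluenceCaseStudy:
    """An open-source case study tagged with DISARM-style technique labels.

    The technique codes used here are placeholders, not real DISARM IDs.
    """
    entity: str
    summary: str
    techniques: list[tuple[str, str]] = field(default_factory=list)  # (code, description)

    def capability_profile(self) -> dict[str, str]:
        """Roll the observed techniques up into a simple capability summary."""
        return {code: description for code, description in self.techniques}

# Hypothetical example: an entity observed amplifying misleading narratives.
case = InfluenceCaseStudy(
    entity="Example Actor",
    summary="Coordinated amplification of misleading narratives.",
    techniques=[
        ("TXXXX-A", "Create inauthentic social media personas"),        # placeholder code
        ("TXXXX-B", "Amplify narratives through coordinated posting"),  # placeholder code
    ],
)

print(case.capability_profile())
```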

Resilience in Human Systems

Technology enables people just as it can be exploited to threaten them. We explore systems of people and technology – sociotechnical systems – to identify how to improve them and increase their resilience. A key factor here is strengthening culture and its drivers – many of which act in different ways across seemingly disparate domains such as security, safety, innovation culture, and diversity and inclusion.

Culture

Commissioned by a government client, we developed an evidence-based prototype software application that measures and evaluates organisational culture in order to reduce security risk. This has now evolved into TEMPERATURE, a core tool in our Culturlabs suite of culture change products and services.

Inclusivity

We explored cutting-edge, future-facing concepts and tools to support diversity and inclusion initiatives for a government client. This involved identifying and interviewing innovators in the domain. We then conducted thematic analysis of the resulting data to produce a series of future-facing recommendations in an accessible and engaging format.

Technology & AI Trust

The successful uptake of technology is typically dependent on a crucial ingredient: our trust. Do individuals, organisations and society perceive a technology as useful, ethical and competent? We develop and execute methods and innovations to evaluate perceived trust and aid technology design – including AI design. Simultaneously we produce and implement toolkits to aid organisations, ecosystems and societies in preparing for uptake and adoption of technology.

Leadership

We developed an evidence-based training programme for leaders in a specific industry sector. The training aimed to upskill them on every aspect of employing AI in their organisations – in particular, treating the technology as an uptake challenge that involves changing culture. The course supported these leaders to role-model change – and to equip technical specialists and other employees alike with the resources, knowledge and motivation to appropriately embrace and scale successful AI use cases across their organisations.

Prepare

We designed, developed, tested and applied a unique AI readiness tool that supports organisations in measuring their readiness to appropriately ingest and scale AI applications. The tool is based on behaviour and culture change principles, so measurements are not only robust but lead directly to suggested remedial activities; once implemented, their effect can in turn be measured by the tool. It is now actively supporting a number of FTSE 250 organisations in building their AI readiness.
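
As a purely illustrative sketch – not the tool itself – the snippet below shows the measure, remediate and re-measure cycle described above, using hypothetical readiness dimensions, scores, thresholds and remedial activities.

```python
# Illustrative only: hypothetical dimensions, scores, thresholds and remedial activities.

REMEDIAL_ACTIONS = {
    "leadership_support": "Run executive briefings on AI opportunities and risks",
    "data_practices": "Establish data quality and governance routines",
    "workforce_skills": "Provide role-specific AI literacy training",
}

def assess(scores: dict[str, float], threshold: float = 0.6) -> dict[str, str]:
    """Suggest remedial activities for any dimension scoring below the threshold."""
    return {dim: REMEDIAL_ACTIONS[dim] for dim, score in scores.items() if score < threshold}

# First measurement (hypothetical survey-derived scores on a 0-1 scale).
baseline = {"leadership_support": 0.45, "data_practices": 0.70, "workforce_skills": 0.50}
print(assess(baseline))   # suggests activities for leadership_support and workforce_skills

# A follow-up measurement after remediation shows whether the activities had an effect.
follow_up = {"leadership_support": 0.72, "data_practices": 0.71, "workforce_skills": 0.66}
print(assess(follow_up))  # empty dict: no further remedial activities suggested
```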

Behaviour Change in Complex Systems

We use behavioural science and systems thinking to understand complex systems of people, generate theories of change, evaluate success in behaviour change, and develop innovative solutions that account for complexity. In addition to directly supporting problem-solving in this context, we build toolkits that enable others to understand complex systems, which helps to empower people to implement, monitor, and measure beneficial behaviour and culture change.

Indicators

Our client needed to identify when market failures arise in their operating context, and what to do to remediate these failures once identified. Using an umbrella review – in which we selected, coded and compiled only ‘keystone’ literature providing an overview of the domains of interest – we constructed an evidence base. We then converted this into a behaviour change toolkit, enabling the client to better identify, analyse and act on the market failures that affect them.

Incentives

We explored how experimental economics theory – including game theory – might be applied practically to solve a range of coordination problems relating to information sharing between actors in an ecosystem: how could each be incentivised to cooperate for the benefit of the community given that the immediate costs to individual organisations were potentially high? This work combined literature review, scenario development, and games design and implementation.
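
As a simplified illustration of the underlying coordination problem (the payoffs are our own assumption, not the client scenario), the sketch below models two organisations deciding whether to share threat information: sharing carries an immediate individual cost, while the benefit accrues to whoever receives the shared intelligence.

```python
# Minimal two-player information-sharing game with hypothetical payoffs.
# Sharing intelligence costs the sharer C immediately; the recipient gains a benefit B.

B, C = 3, 2  # hypothetical: benefit of receiving intel vs. cost of sharing it

def payoff(my_choice: str, their_choice: str) -> int:
    """My payoff: I gain B if they share, and pay C if I share."""
    return B * (their_choice == "share") - C * (my_choice == "share")

for mine in ("share", "withhold"):
    for theirs in ("share", "withhold"):
        print(f"I {mine}, they {theirs}: my payoff = {payoff(mine, theirs)}")

# Withholding is individually dominant (it avoids the cost C whatever the other does),
# yet mutual sharing pays B - C = 1 each versus 0 for mutual withholding. The incentive
# design question is how to reshape these payoffs -- reputation, reciprocity, collective
# agreements -- so that cooperation becomes the stable choice for every actor.
```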