Press release

Understanding Visma’s AI Transformation

Visma CTO T. Alexander Lystad outlines how Visma is enabling AI deployment at scale across its business units.

INSIGHTS: By T. Alexander Lystad, CTO of Visma

Visma has long been at the forefront of adopting emerging technologies, and our journey with AI and large language models (LLMs) is no exception. Nearly a decade after we made our first AI investments, this technology has shaped how our approximately 180 business units build, operate, and grow. AI now connects every layer of Visma’s federated ecosystem as a driver of product excellence, operational scale, and long-term competitiveness.

Our AI-native ecosystem is operational, measured, and revenue-linked. It is becoming the foundation of how every product, support, and growth function operates, powering mission-critical solutions for SMBs and local governments.

Decentralisation that sparks innovation – and AI that supercharges our model

“Visma’s federation is more than a governance model – it’s our entrepreneurial engine, driving innovation that we are now harnessing through AI.”

Visma is an ecosystem of approximately 180 business units, many still led by their original founders. This entrepreneurial core drives continuous innovation - teams test, learn, and ship independently.

When one business unit develops a successful AI solution, others quickly adapt and localise it. To harness this innovation at scale, we combine local autonomy with central guidance. At the group level, there are three types of organisations:

  • Product development teams that build AI components for integration across many products.
  • Centres of Excellence that support companies in specific domains—strategy, best practices, marketing, sales, product development, architecture, public cloud technology, and more. These Centres also facilitate peer-to-peer communities within Visma.
  • Tiger Teams, which are hands-on and operationally involved with companies for defined periods.

This structure ensures knowledge sharing, safe and compliant deployments, and prevents innovation from becoming siloed.

The impact of our AI-native software for our customers and developers

AI is fundamentally reshaping how software is built, delivered, and experienced, with 500+ AI initiatives across the group. For Visma, this transformation isn’t just about technology – it’s about value creation, empowering our customers to achieve more, and enabling our developers to build faster and smarter.

Visma’s AI strategy mobilises our rich data foundation for core areas of impact including AI in products (for customers), AI in growth functions (such as marketing, sales and customer support) and AI in product development. 

Our data foundation

Visma’s advantage stems from a vast customer base that has remained stable and engaged over time, and the rich, multi-source dataset it represents. This combination of trust, adoption, and longitudinal data is central to building AI that works in the real world. 

Our constantly growing proprietary datasets extend from user feedback and user behaviour to customer operational data (e.g. transactions, invoices, and salary data), and they are the foundation of our AI strategy. This foundation enables smarter, personalised engagement, faster data-driven decisions, and sustainable growth, all built on deep trust and long-term relationships.

“This is what gives us our right to win: scale, data, and real adoption powering the AI-multiplier effect.”

The structure: three domains of AI-native impact

  1. AI in products (for customers)

We distinguish two streams in how AI strengthens our products for our customers:

a. Automation: We create software that does the work for customers rather than simply helping them. For example, Smartscan uses AI to extract information from financial documents, automating accounting work.

b. Advisory: Our AI leverages an understanding of our customers’ data, domain, local rules and regulations, and peer benchmarks to provide guidance on how to run and optimise their business. For example, Dinero’s Virtual CFO uses customer and peer data to advise on business optimisation.

Notably, both streams open up a new pricing logic: automation creates labour-offset value, while advisory creates profit-improving value. This enterprise-grade intelligence is now being delivered through advisor-agent models, including Visionplanner and Netvisor.

  2. AI in support & operations

Approximately 113 business units now use AI in customer support and operations, with measurable reductions in support costs compared to non-adopters. At scale, this compounds into both improved customer experience and meaningful margin expansion.

  3. AI in product development

AI-native engineering has accelerated projects by 1.3x to 10x. For example, our Polish business InFakt reduced a 4,200-hour app rebuild to 400 hours with AI assistance. This shift from assistance to delegation moves us up the agentic ladder, reshaping software development.

“We’re giving SMBs an advantage that normally only large enterprises have access to.”

Looking ahead

For the next 12 months, we will continue to responsibly scale our portfolio and deploy more agentic modules into production, focusing on experimentation, testing, and cross-pollination.

AI is no longer just an accelerator—it is shaping Visma’s progress, and our federation model multiplies this power, turning intelligence into scalable, transformative impact.

Contact

Lage Bøhren

Director of Communications

Visma
