The Questions We Didn't Answer: Reflections on Two Decades of Technology Development Funding
Musings on two decades of technology grants and what they reveal about policy, evidence, and democratic participation
Last week we launched our report "What are we funding: Analysis of 2004–2024 funding from Technology Development Fund" (Northstack, 2025), examining how two decades of grants from Iceland's Technology Development Fund have been allocated across technologies and industries. The response has been gratifying, from thoughtful comments to detailed media coverage. But now that the data is out there, we want to step back and share some broader reflections on what these patterns might mean and why they matter.
The Questions We Didn't Answer (Yet)
In our report, we deliberately focused on the "hard facts," mapping where nearly two decades of public innovation funding has actually gone. We highlighted what seemed remarkable or surprising to us, but largely avoided the normative questions that our findings inevitably raise.
Now it's time to wrestle with those harder questions.
Is this allocation desirable? The dominance of software projects isn't inherently good or bad, but it does represent a choice, even if an inadvertent one. When over 60% of the country’s direct R&D funding supports the development of a single (albeit broad) technology category, we're effectively making a bet about Iceland's economic future. The question is whether that bet aligns with our national priorities and comparative advantages.
The decline in ocean-related tech funding is particularly striking given Iceland's maritime heritage and the global push toward sustainable ocean technologies. Are we missing opportunities in areas where we have natural advantages? Or are we simply following where the entrepreneurial energy leads?
Why do we see these patterns? The evidence points to supply-side dynamics rather than deliberate policy choices. Anecdotally, the skew does not come from assessor preferences; it comes from what kinds of projects get proposed in the first place.
Two factors seem particularly important here. First, many stakeholders believe that the timelines and grant amounts of most TDF grant types naturally favor software development over hardware-intensive sectors. Generally speaking, building an AI application requires different resources, and operates on different timescales, than developing marine technology or pharmaceuticals.
Second, and more fundamentally, without an active government industrial policy that identifies and heavily invests in key emerging technologies, it's unlikely we'll see deep tech sectors emerge. Software can thrive in a relatively low-infrastructure environment. More capital-intensive technologies need coordinated support across multiple policy areas: research infrastructure, innovation funding, skills development and more, all working in concert.
The Evidence Gap
This brings us to our second, perhaps more important point: we need more analysis like this.
Independent, quantitative analysis based on open data, shared transparently, serves a different function than expert commentary, in-house government analysis, or research commissioned by interest groups. It doesn't replace these other forms of knowledge, but neither can it be replaced by them.
The government's draft bill consolidating the existing research and innovation funds (Frumvarp til laga um opinberan stuðning við vísindi og nýsköpun, a bill on public support for science and innovation) illustrates this perfectly. The proposed restructuring (creating a Science Fund, an Innovation Fund, and a Challenge Fund) represents a significant strategic and administrative overhaul. The bill refers to reviews and evaluations that informed this approach, but doesn't publish their detailed findings or methodology. It cites international best practice without performing direct comparative benchmarking. It promises three-yearly impact evaluations (a welcome development!), but does not say how those evaluations will be commissioned or what standards of evidence will be expected.
The consolidation logic is compelling in principle and aligns with many of the recommendations emerging from our report. The proposed Challenge Fund is particularly intriguing as it hints at the possibility of movement beyond traditional grants toward alternative funding mechanisms. Different technologies and sectors might benefit from approaches like procurement programs that create markets for innovative solutions, loan guarantee schemes that de-risk private investment, or challenge prizes that incentivize specific outcomes. But without details on how these new funds will operate differently from existing programs, it's hard to assess whether they represent genuine innovation in funding mechanisms or just administrative reshuffling.
These aren't necessarily flaws in the bill: policy documents serve different purposes than research reports. But they highlight a broader pattern: policy is being made with reference to evidence that remains largely invisible to public scrutiny.
Democracy Needs Data
This matters for democratic participation. Meaningful public consultation requires good, clear, accessible, and unbiased evidence. Citizens (and in our case, entrepreneurs) can't engage thoughtfully with policy choices if they can't interrogate the results that past policies have actually produced.
The new government plans represent a genuinely exciting step forward in thinking systematically about innovation policy. The emphasis on industrial strategy, the recognition that different innovation policy goals require different support mechanisms, the commitment to regular evaluation – these are all welcome developments.
But realizing this vision demands a cultural shift toward transparency, evidence-based decision-making, and public accountability that goes beyond the innovation funding system itself.
Looking Forward
The funding patterns we documented in our report did not result from conscious top-down planning. Rather, they emerged from the accumulated decisions of individual applicants, assessors, and administrators operating within existing institutional frameworks. There's nothing wrong with bottom-up, supply-driven allocation, but it shouldn't be the only factor shaping where public resources go.
The question now is whether Iceland can develop the institutional capacity and political commitment needed for a more strategic approach. This means not just setting priorities and reorganizing funds, but creating the conditions (infrastructure, skills, funding mechanisms, regulatory framework) that enable those priorities to be realized.
It also means building systems for ongoing learning and adaptation. Innovation policy happens in a world of fundamental uncertainty. We can't know in advance which technologies will prove transformative or which approaches will work best. What we can do is create robust systems for generating evidence, learning from experience, and adjusting course when needed.
That's why independent analysis matters. That's why transparency matters. And that's why we'll keep doing this work: because democracy works better when citizens have the information they need to hold their governments accountable.
What do you think? Are we asking the right questions about Iceland's innovation funding? What other patterns would you like to see analyzed? Let us know in the comments.