AI in Social Reporting

What Institutions in the Social Sector Must Know and Decide

AI adoption in Switzerland is relatively advanced; the social sector, however, lags behind. This delay has many reasons, among them Swiss thoughtfulness, yet the situation in the social sector is particularly challenging, and uncertainty is high. In what follows, I’ll describe current and potential use cases of AI in the Swiss social sector and explain the technical, linguistic, and legal conditions that must be met. My aim is to illustrate which requirements text-based reporting tools must fulfill in a protective governmental environment. I will end by presenting our own solution, the “Berichte-Assistent.”

The Swiss Social Sector and Its Complex Reporting

The Swiss social sector spans many kinds of institutions, such as administrative bodies, privately owned SMEs, and charities or foundations. They support clients in difficulty with customized programs, such as labor integration, sociopedagogical family support, sheltered workshops, or reintegration after incarceration.

Digitalization in this area usually means case management software. Such software has so far made no use of AI technologies, and the question of whether and how AI can add value to the social sector is only now emerging.

Reporting in the social sector means that case management data must be turned into text. Specialists in the Swiss social sector dream of AI support in their reporting because social reporting is costly. Reports must meet high standards: they must be complete, truthful, intelligible, and unbiased, and they are written individually by specialists such as custodians, special education workers, job coaches, and special education trainers. And they usually take too much time.

Report writing challenges specialists because most of them have never received any training in reporting. On top of that, there is a myriad of report types, ranging from notebooks to final reports and from assessments to accounting. To make things worse, societal trends regarding correct language use add further challenges.

Social reporting thus requires specialists to constantly decide what exactly should go into a report and how it should be worded. They often lack answers and guidelines. This uncertainty also hampers the development of proper technical solutions.

Each Institution Needs a Concept

In addition, data protection causes uncertainty. Many specialists know that AI-driven tools carry risks, even if they cannot fully assess them.

This is why many social institutions in Switzerland are unsettled. Scarce resources push leaders to look for technical ways to automate their reporting, while the unregulated use of AI-driven chatbots, so-called shadow AI, is spreading.

Social workers and other specialists shouldn’t be left alone with this. Institutions must develop a concept that determines – even if only preliminarily, at first – how and which AI tools may be used.

Yet, what is even meant by “AI” in the Swiss social sector? Switzerland is far away from Silicon Valley.

Which AI Technology Is Relevant for Reporting in the Social Sector?

In Switzerland, “AI” is commonly used as a synonym for chatbots, so it’s important to clarify which technology is actually meant. For social reporting, the relevant technology is generative AI (GenAI): technology that generates a product, e.g. text, audio files, or images. Text is generated by large language models (LLMs), which apply algorithms based on linguistic rules and statistical probabilities.

LLMs differ in performance and data processing. Closed-source models are accessible only to their developers behind proprietary APIs, while open-source models can be inspected, downloaded, and modified by anyone.

For the social sector to use LLMs in reporting, this difference is crucial because of data protection. In the Swiss social sector, only applications built on open-source models can be used legally and safely.

How Is Data Protection Relevant for the Social Sector in Switzerland?

Apart from technical risks, data protection is the number one hurdle to using AI technologies in the Swiss social sector: all data managed in the sector are sensitive personal data protected by law, and they belong to people in situations of vulnerability.

Each institution holds explicit legal permission to process the relevant data. If data are transmitted to the commissioning entity, or if a third party accesses or receives them, that party needs the same legal permission and must operate under the same legal conditions. This also applies to providers of technical solutions. And most providers of closed-source models are based outside Switzerland.

If a software provider is based abroad, what matters is whether their country offers comparable legal conditions; for Switzerland, this generally applies to EU- and EFTA-based companies. For software providers elsewhere, including the US, the law is a show stopper. The Conference of Swiss Data Protection Officers states that “outsourcing sensitive personal data […] in SaaS solutions to large international providers is […] impermissible in most cases.”

This assessment is based on several factors, namely the lack of transparency in data processing, the US CLOUD Act, and providers’ option to amend contracts unilaterally. It also applies to most chatbots built on closed-source models, such as Copilot, ChatGPT, Gemini, Claude, or Perplexity.

Conversational Chatbots Are No Reporting Solution

Given these limiting conditions, software based on closed-source models isn’t a legitimate reporting option in the Swiss social sector. Freely available tools don’t meet the requirements for data protection and privacy.

Simply removing personal names from a report does not count as anonymization either. Since reports in the social sector contain many specific data points, AI systems can easily connect the dots and assign the data to individuals. AI chatbots must therefore never be fed any personal data to generate a report.

The same goes for audio recordings and translation tools.
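The re-identification risk described above can be made concrete with a minimal Python sketch. All data here are invented for illustration: even after the name is stripped, the remaining data points combine into a quasi-identifier that could single out one person in a small municipality.

```python
import re

# Toy report excerpt with entirely invented data.
report = (
    "Ms. Meier, born 12.03.1987, lives in a small municipality, "
    "works part-time at the municipal library, and has been "
    "receiving sociopedagogical family support since January."
)

# Naive 'anonymization': strip only the personal name.
redacted = re.sub(r"Ms\. Meier", "[REDACTED]", report)

# The quasi-identifiers survive the redaction; in combination,
# they may still point to exactly one person.
quasi_identifiers = ["born 12.03.1987", "municipal library", "family support"]
still_identifiable = all(q in redacted for q in quasi_identifiers)
print(still_identifiable)  # True: the person remains re-identifiable
```

This is why name removal alone is not anonymization in the legal sense: any combination of birth date, location, employer, and support measure can act as a fingerprint.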

What is more, chatbots don’t comply with the quality standards of social reporting. Without specialized prompt engineering and post-generation revision, the generated text doesn’t meet the requirements. Users must compensate for these shortcomings themselves, so no time is saved and no real automation is achieved.

Sector-Specific Solutions

These challenges, however, don’t imply that AI can’t be used in the social sector. On the contrary: new solutions are currently in the making, including customized software and chatbots. They cover different aspects of report writing, such as case management software, text-generation tools for specific report types, and research tools, especially in labor integration.

Each Institution Needs to Find Its Own Solution

What solution works best depends on the needs. Institutions must clarify what they want to achieve: save time, optimize processes, or standardize reports. Depending on the goal, a different application will fit best. There’s no one-tool-fits-all solution.

The following conditions are crucial to establish a lasting solution:

  • Development and decisions are managed by operations, not IT; IT will implement.

  • The tool incorporates operative, linguistic, and technical expertise.

  • Requirements, such as time saving or report quality, are clarified.

  • Privacy by Design: Data protection is considered from the very beginning.

  • The model won’t be used for further training.

  • Ideally, servers are located in Switzerland and operated by Swiss providers.

Our own SaaS is the Berichte-Assistent, which translates as ‘Reporting Assistant’. It supports specialists by guiding them through the process, alleviating the mental workload, and generating reports that are consistent in content and language. It thus enables standardization at a high level. Data protection is fully ensured.

In a nutshell, current challenges in the Swiss social sector show that text-based reporting can be automated even in a protective environment. We’re proud to contribute a solution.

About the Author

Consultant | Speaker | Author

Danae is a versatile communications expert with extensive experience in research and industry and an extraordinary affinity for people and languages. She holds a PhD in linguistics and has published her research with renowned publishing houses. She has been enabling communication for corporate, public, and private clients across a wide array of sectors and cultures for two decades. She has the ability to grasp the essence of a message, put it into the right words, and convey it at eye level.


Impressum

  • Anaphora GmbH
    Birchstrasse 1b
    8542 Wiesendangen
    Switzerland

  • UID CHE-473.899.190
  • +41 41 711 44 66
  • Monday – Friday:
    8:00 – 12:00 | 13:30 – 17:00
