Trust & AI Act self-declaration
Referential v2026.04.1 · Last reviewed: April 20, 2026
This page documents the transparency and compliance posture of ActLoom itself under Regulation (EU) 2024/1689. We apply to our own product the same standards we ask our customers to apply to theirs.
What ActLoom is, and is not
ActLoom is a software tool that helps organisations structure, version and time-stamp the artefacts required by Regulation (EU) 2024/1689 (the EU AI Act). It is not a law firm, not a notified body, and not a certification authority.
We sell infrastructure of proof, not proof itself. The legal interpretation of your obligations, the final review of each artefact, and any regulatory filing remain the responsibility of your organisation and its qualified advisors (DPO, legal counsel, external auditors).
AI providers used by ActLoom (Article 50 transparency)
Several features of ActLoom (guided assessments, draft FRIA content, remediation suggestions, incident report narratives) are generated with the assistance of third-party large language models. In line with Article 50 of Regulation (EU) 2024/1689, any output produced with such assistance is labelled inside the product.
We currently use the following providers. We select models that offer zero-retention or strict data-processing terms consistent with GDPR and the AI Act:
- Anthropic (Claude family)
- Used for analysis, drafting, and reasoning-heavy tasks. Processed under Anthropic's commercial terms; customer data is not used for model training.
- OpenAI (GPT family)
- Used as a fallback provider for resilience and for selected structured-output tasks. Processed under OpenAI's API terms; customer data is not used for model training.
Data residency and hosting
Customer data is stored in the European Union by default. Operational data is processed in EU regions of our infrastructure partners.
- Application hosting
- Vercel – EU region deployment (fra1 / cdg1).
- Primary database
- Neon (Postgres) – EU region.
- Tenant isolation
- Row-Level Security (RLS) policies are applied for tenant-scoped tables to prevent cross-tenant data access.
- Blob storage, KV and queue
- Vercel Blob / KV and QStash – EU region where available.
- Transactional email
- Resend – messages are delivered through Resend's infrastructure with standard contractual safeguards in place.
- LLM inference
- Anthropic and OpenAI endpoints may route outside the EU; requests carry no direct identifiers beyond what is strictly necessary to answer the user's prompt, and are covered by data processing agreements and Standard Contractual Clauses (SCCs) where applicable.
DPA and sub-processors
ActLoom offers a Data Processing Addendum (DPA) to customers and maintains a sub-processor list in this trust section.
Data is encrypted in transit (TLS) and at rest, with access controls scoped by least privilege and tenant boundaries.
- DPA
- Available for customer signature with SCC references where cross-border processing applies.
- Main sub-processors
- Vercel, Neon, Resend, Anthropic, OpenAI, Upstash/QStash.
- Primary hosting region
- European Union (Frankfurt region footprint).
- Security posture
- SOC 2 Type I planned for Q4 2026. Current controls are tracked through internal security and access review checklists.
Integrity of generated artefacts
Every exported artefact (audit CSV, signed dossier, compliance report) embeds a cryptographic integrity footer. This lets an external reviewer re-verify that the content was not altered after export.
- Content hash
- SHA-256 of the canonical content plus metadata.
- Signature
- HMAC-SHA256 over the content hash using a server-held signing secret.
- Trusted timestamp
- Critical audit events are additionally time-stamped under RFC 3161 via a public TSA (FreeTSA by default, override via the TSA_URL environment variable).
- Referential version
- Each artefact records the AI Act referential version it was generated against (current: v2026.04.1).
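As an illustration, the hash-then-sign scheme above can be re-derived with nothing but a standard library. This is a hypothetical sketch, not ActLoom's actual footer format: the field names, the sorted-key JSON canonicalisation, and the signing secret shown here are assumptions for the example only.

```python
import hashlib
import hmac
import json

# Illustrative only: in production the signing secret is server-held.
SIGNING_SECRET = b"server-held-signing-secret"


def integrity_footer(artefact: dict) -> dict:
    """Compute a footer: SHA-256 of the canonical content, then an
    HMAC-SHA256 signature over that hash (field names are assumed)."""
    canonical = json.dumps(artefact, sort_keys=True, separators=(",", ":")).encode()
    content_hash = hashlib.sha256(canonical).hexdigest()
    signature = hmac.new(SIGNING_SECRET, content_hash.encode(), hashlib.sha256).hexdigest()
    return {"content_hash": content_hash, "signature": signature}


def verify(artefact: dict, footer: dict) -> bool:
    """Re-derive both values and compare in constant time."""
    expected = integrity_footer(artefact)
    return (
        hmac.compare_digest(expected["content_hash"], footer["content_hash"])
        and hmac.compare_digest(expected["signature"], footer["signature"])
    )


artefact = {"report": "FRIA draft", "referential": "v2026.04.1"}
footer = integrity_footer(artefact)
assert verify(artefact, footer)
assert not verify({**artefact, "report": "tampered"}, footer)
```

Because the signature is an HMAC keyed with a server-held secret, an external reviewer can recompute the content hash independently but needs the service to confirm the signature; the detached RFC 3161 timestamp from the TSA supplies the independent time anchor.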
Security and incident reporting
Security vulnerabilities, data-handling concerns, or abuse reports affecting ActLoom itself should be sent to the address below. We acknowledge reports within two business days and coordinate disclosure on a good-faith basis.
This channel is for reports about the ActLoom platform. AI-system incidents that you are required to notify under Article 73 of the AI Act must still be reported by you to your competent national authority; ActLoom does not file on your behalf.
- Security contact
- security@actloom.com
- General support
- support@actloom.com
- Acknowledgement target
- Two business days (best effort).
Our own position under the AI Act
As currently designed, ActLoom provides a software-as-a-service product. We do not place an AI system on the EU market as an AI provider in our own name; we integrate third-party general-purpose AI models whose providers are themselves subject to Chapter V of the AI Act.
Where ActLoom's AI-assisted features generate text, summaries or recommendations shown to a user, we apply Article 50 transparency in the product UI and in the exported artefacts. If your jurisdictional analysis reaches a different conclusion for your use case, please contact us, and we will engage transparently.
Limits and responsibilities you keep
We are explicit about what ActLoom does not do, so that there is no ambiguity when a regulator reviews your file.
- Not legal advice
- Any analysis or draft is indicative only and must be reviewed by a qualified human before any regulatory use.
- Not a conformity assessment body
- ActLoom does not conduct Article 43 conformity assessments and cannot issue a CE marking.
- Not an eIDAS qualified trust service provider
- Our cryptographic integrity and timestamping are strong signals but do not, by themselves, constitute a qualified electronic signature or seal.
- No filings on your behalf
- Submissions to EU databases or national authorities are made by you, using artefacts produced in ActLoom.
Updates to this page
We refresh this page whenever our providers, hosting regions, or the AI Act referential version change. The last review date is shown at the top, and meaningful changes are noted in the product changelog.