Learn from real GRC work + GRC Engineering context

The GRC Companion

Turn vendor reviews, audit walkthroughs, questionnaires, policy work, control discussions, and terminal output into learning loops. It runs where you already use Claude or Codex, so your local work becomes the curriculum.

Real-work retrospectives · Synthetic labs when useful · GRC Engineering corpus
Choose your local workspace

Install it where your GRC work already happens.
Where do you want the Companion to learn with you?

Generated setup

Your local package path and system prompt.

This website does not upload your artefacts. The actual Companion runs inside the AI workspace you choose and can learn from the local files, transcripts, diffs, notes, and outputs you intentionally provide there.

What it learns from

Your day-to-day work becomes the material.

The Companion is not a generic prompt wrapper. It uses the same things you already bring to Claude or Codex: review notes, questionnaire drafts, policy changes, control conversations, terminal output, project files, and learning gaps. It extracts the patterns, connects them to GRC Engineering thinking, and turns the next rep into practice.

Actual work

Retrospectives on real experience

Use local outputs from vendor reviews, audits, policies, controls, and stakeholder conversations as learning material without asking the website to store them.

Practice

Labs when you need reps

When real work is unavailable or too sensitive, the Companion creates synthetic labs, scenarios, explain-backs, and drills to build the same judgment.

Context

Ayoub's GRC Engineering lens

The brain, primitives, newsletters, examples, and cross-domain patterns give your experience a sharper learning frame.

Behaviour proof

It learns from the work without taking over the decision.

The adapter should behave like this transcript: use the learner's real local work as material, hold the approval/audit boundary, infer the learning move, and produce a practical rep.

demo/run-terminal-demo
$ grc-companion

Learner:
Use my vendor-review-notes.md to help me learn from the review.
Do not decide whether the vendor should be approved.

Companion:
I can use that local file as learning material.
I will not make an approval recommendation.

Learning move:
Task retrospective on the review you already performed.

Active recall:
What signal actually helped you move the review forward?

Output:
A learning note, a reusable evidence-quality checklist,
and one profile update about where you are getting stuck.

Invisible skills

The learner brings the work. The Companion routes the learning move.

The router decides whether the moment needs a real-work retrospective, a lab, tutoring, an explain-back, recall, reflection, or a cross-domain lens.

Listen: Analyse the local context.

Goal, timing, files, outputs, sticking points, and boundary risk.

Route: Pick the learning move.

Skills stay as files. Skill buttons disappear.

Teach: Ask, extract, build.

Real-work retrospectives, active recall, small artefacts, and explain-backs.

Transfer: Extract the reusable pattern.

Retrospectives and adjacent-domain lenses compound judgment.
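The Listen/Route/Teach/Transfer loop above can be sketched as a small routing function. This is an illustrative sketch only: the signal names, the `Context` shape, and the move labels are assumptions for the example, not the Companion's actual contract.

```python
from dataclasses import dataclass

@dataclass
class Context:
    """Hypothetical signals gathered in the Listen step."""
    has_real_artifact: bool  # learner supplied local work, e.g. review notes
    too_sensitive: bool      # real work cannot be used as learning material
    stuck_on_concept: bool   # learner appears blocked on understanding

def route(ctx: Context) -> str:
    """Route step: pick one learning move from the context (sketch only)."""
    if ctx.stuck_on_concept:
        return "concept tutoring"
    if ctx.has_real_artifact and not ctx.too_sensitive:
        return "real-work retrospective"
    return "synthetic lab"  # build the same judgment on safe material

# The Teach and Transfer steps would then act on the chosen move.
print(route(Context(True, False, False)))  # real-work retrospective
```

The point of the sketch is the priority order: real work wins when it is available and safe to use, and labs are the fallback rather than the default.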

Runtime scaffolding
brain/

Router, extraction, local learning

The intelligence lives in versioned companion contracts: skill routing, local work learning, task extraction, and transfer.

skills/

Learning behaviours

Real-work retrospectives sit beside synthetic labs, concept tutoring, recall, reflection, and cross-domain translation.

commands/

Power-user entry points

`/retro` and `/translate` exist for explicit use, while default routing stays invisible.
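An explicit invocation might look like this, in the style of the demo transcript above (the argument syntax is an assumption; only the command names come from the package):

```
$ grc-companion
/retro vendor-review-notes.md    # retrospective on a local artefact
/translate                       # reframe it through an adjacent-domain lens
```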

docs/

Static installer

A GitHub Pages site helps the learner choose a package and system prompt. Your real artefacts stay in the local AI workspace you choose.
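Put together, the scaffolding above suggests a package layout along these lines (only the four top-level directories come from this page; the annotations are a summary, and any deeper structure is an assumption):

```
grc-companion/
├── brain/      # versioned companion contracts: routing, extraction, transfer
├── skills/     # retrospectives, labs, tutoring, recall, reflection, translation
├── commands/   # explicit entry points such as /retro and /translate
└── docs/       # static installer published via GitHub Pages
```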

Load paths
Tier 0

Static setup wizard

Public page at grc.engineering/companion with package routing and system prompt generation. It does not upload your work.

Tier 1

Portable adapters

Generated bundles for Claude Code, Claude Projects, Cursor, and Codex.

Tier 2

PAI pack

On the roadmap once adapter behaviour and learner signals stabilise.