Boutique IT Consultancy · Perth
One operating standard. Every engagement peer-reviewed before you see it.
Kestrel Advisory is a boutique IT consultancy that operates with the discipline of a professional services firm and the depth of a specialist engineering team. We do not generalise. Every engagement is staffed with the right specialist for the domain — and their work is independently reviewed before it reaches you.
A team of specialists across enterprise architecture, security, full-stack development, financial modelling, quality assurance, and delivery management. We work on a structured engagement model: brief, architecture direction, solution design, implementation, QA, peer review, delivery gate. Every step documented. Nothing shipped on a hunch.
We built our own work-tracking CLI — docket — because off-the-shelf tools did not fit how we operate. That is the register we work in.
Peer review is not optional. Every deliverable — code, architecture decision, financial model, security assessment — is reviewed by an independent specialist before you see it. The reviewer runs on a different model to the author. Shared assumptions produce shared blind spots; we do not allow them.
Human checkpoints matter. Ambiguity scored at intake. Solution sign-off before implementation begins. Delivery gate before anything leaves the firm. You are in the loop at every decision point that affects scope, cost, or direction.
Specialist lanes are enforced, not aspirational. An enterprise architect does not implement. A developer does not make architectural decisions. A security engineer does not write the business case. Lane violations surface as firm escalations — they do not quietly become someone else's problem in the next sprint.
We deliver artefacts, not vibes. Every engagement closes with a full audit trail: decisions documented, trade-offs recorded, architecture rationale written. The next person who looks at it can understand exactly what was decided and why.
Nine specialist practices. Each staffed by a domain expert who does nothing else. No generalists filling gaps — the right specialist on every engagement.
Technical direction set before a line of code is written. Our EA defines system boundaries, integration patterns, and the non-negotiable constraints that keep a platform maintainable at scale. No implementation begins without a signed-off architecture direction.
From architecture direction to buildable specification. Our SA translates system intent into component boundaries, data flows, and API contracts — OpenAPI 3.1, Pydantic models, contract tests. The developer receives a spec they can implement against without guessing.
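To illustrate the contract idea, here is a minimal stdlib sketch of a request model that rejects bad input at the boundary. The field names and ranges are hypothetical, invented for illustration; in practice this would be a Pydantic model backing an OpenAPI 3.1 schema rather than a plain dataclass.

```python
from dataclasses import dataclass
from decimal import Decimal


@dataclass(frozen=True)
class ContributionRequest:
    """Hypothetical request contract. Field names are illustrative only."""
    member_id: str
    amount: Decimal
    financial_year: int

    def __post_init__(self) -> None:
        # A contract validates at the boundary, so the implementation
        # behind it never has to guess what a field may contain.
        if self.amount < Decimal("0"):
            raise ValueError("amount must be non-negative")
        if not 2000 <= self.financial_year <= 2100:
            raise ValueError("financial_year out of range")
```

The point is where the validation lives: at the edge of the component, named in the spec, rather than scattered through the implementation.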
Rust for systems work and CLI tooling. Python for data pipelines, financial engines, and API backends. React for browser UIs where the interaction warrants it. Every build ships with a Dockerfile, a justfile, and a test suite — not just source code.
Threat modelling against target architecture before changes are made. Cryptographic implementation review, secrets boundary hardening, and breakglass procedure design. We do not assess your posture on a Friday and hand you a 90-page PDF on a Monday.
Structured risk assessment against your actual environment — not a generic checklist. Our Cyber Risk Specialist produces a ranked remediation list with implementation guidance. Prioritised by impact and exploitability, not by which finding fills the most slides.
Test-first is not a preference — it is a delivery condition. Our QA engineer writes the test suite before implementation begins, authors property-based tests where the invariants warrant it, and validates offline in both target browsers before anything ships.
Stories authored in docket — the firm's own CLI — before implementation begins. Acceptance criteria written as testable conditions, not aspirational prose. No developer writes code against a vague requirement; no QA engineer validates against a moving target.
AU retirement, superannuation, and tax modelling by a FIAA-qualified actuary and a CA ANZ-qualified accountant. Decimal arithmetic throughout — no float in the money path. ATO rules implemented to spec, rounding modes named and documented, assumptions traceable to their source.
Cycle cadence, board discipline, and structured handoffs between specialists. Our Scrum Master runs the engagement clock, surfaces blockers before they compound, and holds the lane discipline that keeps specialists focused and delivery predictable.
Not a process diagram for a slide deck. The actual sequence every engagement runs through — from the moment an Owner brief lands to the moment a deliverable is handed over.
Every brief is scored for ambiguity before work begins. Low — proceed immediately. Medium — one clarifying question, then proceed. High — two or three structured questions before any commitment is made. No agent interprets a vague brief as a mandate to guess.
Our Business Analyst authors work items in docket — the firm's purpose-built CLI. Acceptance criteria written as testable conditions. No implementation begins without signed-off stories. No developer receives a vague brief and interprets it as a design mandate.
The EA sets technical direction and non-negotiable system constraints. The SA translates that direction into a buildable specification — component boundaries, data shapes, API contracts. Both artefacts are written, reviewed, and signed off before any code is written.
Implementation is executed by the right specialist for the domain — not whoever is available. Lane discipline is enforced: developers implement, they do not architect. Violations surface as firm escalations. No one silently makes decisions outside their domain to keep a deadline.
The QA engineer writes the test suite before implementation begins, not after. Property-based tests where the invariants warrant it. Golden-file regression for formats. Browser-level end-to-end validation against both target environments before anything leaves the development stage.
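A property-based test checks an invariant across many generated inputs rather than one hand-picked case. The following is a stdlib-only sketch of the idea (a real suite would use a framework such as Hypothesis, and the rounding helper here is invented for illustration):

```python
import random
from decimal import Decimal


def round_to_cents(amount: Decimal) -> Decimal:
    # Round to the cent using Decimal's default banker's rounding.
    return amount.quantize(Decimal("0.01"))


def check_rounding_idempotent(trials: int = 500) -> bool:
    # Property: rounding an already-rounded amount must change nothing.
    rng = random.Random(20240101)  # seeded so failures are reproducible
    for _ in range(trials):
        raw = Decimal(rng.randrange(0, 10_000_000)) / Decimal(100_000)
        once = round_to_cents(raw)
        assert round_to_cents(once) == once
    return True
```

The invariant, not the example, is the asset: one property statement covers the cases a hand-written suite would enumerate one by one.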
Every deliverable is reviewed by an independent domain specialist before it reaches the client. The reviewer runs on a different AI model to the author — by design. Shared models produce shared blind spots. This is an independence rule, not a preference.
The Firm Director reviews every deliverable before it reaches the client. Pass — delivery proceeds. Fail — a revision brief is written and routed back to the specialist. The client never sees a deliverable that has not cleared this gate.
The deliverable arrives with a complete audit trail. Every decision documented, every trade-off recorded, every architecture choice explained. The next person who looks at the work — in six months, in three years — can understand exactly what was built and why.
docket
The firm's purpose-built work-tracking CLI. Written in Rust. 556 tests. Project-scoped IDs, typed links, audit trail, cycle management. Built because none of the off-the-shelf tools fit how the firm operates. Open source.
Agents were dispatched without updating the work record. Completed work was marked done without a findings comment. Comment bodies were written inline, risking truncation and loss.
The pattern was subtler than a single failure — which is why it required an Owner-level observation to surface it. When it was raised, it was raised immediately, not noted and forgotten.
The gap went into the firm's work tracker as a discrete, tracked item. That made it observable, auditable, and part of the firm's permanent record. Process Record: OPS-126
assign before dispatch · findings comment before Done · body-file for substantive content
These are now enforceable standards in the firm's execution document and operating protocol — not aspirational guidelines.
Every engagement since OPS-126 has run under these standards. They are not guidelines. They are operating rules.
A firm that cannot improve its own process cannot reliably improve yours.
Domain experts, not generalists. Each specialist operates within their lane and their work is reviewed by a peer before it reaches the client.
Firm Director & Constitutional Gatekeeper
Every engagement passes through Max twice: when the brief is accepted and before the work reaches the client. He holds the firm's standards in both directions — scope, handoff discipline, delivery gate — and his view of what "done" means is non-negotiable. The standard does not bend to the timeline.
Scrum Master & Delivery Orchestrator
Sam runs the in-flight mechanics of every engagement: cycle cadence, blocker triage, specialist dispatch, and the handoffs that keep work moving without gaps. He reads compounding risk early — the small drag that doubles cost by next week if nobody names it — and surfaces it before the client feels it. If Sam is on it, it is on track.
HR & Talent Director
Grace writes every specialist charter and owns the decision about who is on this roster. She thinks in capability gaps: when the firm needs a new discipline, she defines what the role must be able to do before she considers who fills it. The lane discipline that makes the firm's structure hold exists because Grace wrote the charters that enforce it.
Enterprise Architect
Archie sets the technical direction before a line of code is written — service boundaries, data domains, failure modes, and the decisions that are expensive to reverse. He has spent long enough at the boundary between business and engineering to know which technical choices are actually business choices in disguise. The architecture is either right before implementation begins or wrong for the entire project's lifetime.
Solutions Architect
Where Archie sets direction, Sasha makes it buildable. She owns the API contracts, data models, and component boundaries that sit between architectural intent and working software — the specification layer where ambiguity goes to be eliminated. If a developer has to guess what a spec means, Sasha considers it unfinished.
Security Engineer
Mira works at the implementation level: cryptographic design, secrets boundary architecture, machine identity, and threat modelling against the actual target system. She understands attack paths well enough to design against them, not just document them. Her findings come with implementation guidance — an observation without a remediation path is not a finding, it is a note.
Cyber Risk Specialist
Brenda assesses risk against real environments using structured methodologies — OWASP, ASVS, CVSS-scored findings with blast-radius context — and delivers a ranked remediation list, not a compliance checklist. She has written security policy that organisations have actually implemented rather than filed. Her output changes what you do next.
AU Retirement Modeller
Marcus is FIAA-qualified and specialises in Australian retirement planning: superannuation accumulation and drawdown, Age Pension means-testing, and the interaction between the two where most projections go wrong. Every number he produces is traceable to its source assumption, and he knows the difference between a result that is actuarially correct and one that is legislatively wrong.
AU Tax Accountant
Priya implements ATO rules to specification — rounding modes named, assumptions documented, every number traceable to its legislative source rather than prevailing industry practice. CA ANZ-qualified, she works as an accountant, not a financial planner: the distinction matters, and she will tell you why. She has delivered advice in contested situations and stood behind it.
Full Stack Developer
Kai ships complete deliverables: tests, containerisation, and documented build automation alongside the source — not source files alone. His stack spans Python financial modelling, Rust CLI tooling, Streamlit interfaces, and OpenAPI implementation. The professional standard does not change based on who is watching.
Terminal Application Specialist
Bert built docket — the firm's own work-tracking tool — from scratch in Rust, with 556 passing tests at delivery. He has strong opinions about correctness and does not quietly accept a solution that is merely working. The test suite is the specification; everything else follows from that.
QA Engineer
Tess writes the test suite before the developer writes a line, with property-based tests, golden-file regression, and adversarial scenarios alongside the happy path. Writing tests first is not a workflow preference — it is a discipline that changes what gets built. She finds the thing the developer missed. She finds it before the client does.
Business Analyst
CBAP-qualified, Nora writes stories small enough to be built and tested in a handful of days, with acceptance criteria expressed as Given-When-Then conditions a QA engineer can automate. She elicits requirements in three capped interview rounds and does not ask the same question twice. The team builds against her work because it leaves nothing to interpret.
CLI Peer Reviewer
Wes reviews code with an offensive lens — looking at the thing the author did not think to question, not the thing the author already tested. He runs on a different model from the author, by design, so he shares none of the author's assumptions. He gives verdicts. He has failed deliverables. He has been right about it.
Cyber Risk Peer Reviewer
Naveen reviews Brenda's risk assessments independently, applying a three-lens methodology: legal and regulatory exposure, strategic impact, and operational consequence. He runs with no shared context from the original assessment and has a particular focus on residual risk — what remains after controls are applied, not just what the controls are designed to cover.
Retirement Modelling Peer Reviewer
Eleanor reviews actuarial models for quantitative correctness, AU regulatory alignment, and practical applicability — in that order. She reviews the assumptions, not just the outputs: a model that produces the right number for the wrong reason does not pass. She runs independently from Marcus and has disagreed with his outputs. She has been proved right.
AU Tax Peer Reviewer
Hamish reviews Priya's tax implementation against the relevant legislation — not industry convention, the legislation. His three lenses are rule correctness, completeness, and currency: tax rules change, and implementations can lag. He has given difficult advice under challenge and held his position when the evidence supported it.
Firm PA & I/O Specialist
Pax handles all mechanical I/O and lookup work across every engagement: file reads, system checks, bounded API calls. The work is defined, the execution is precise, and the overhead of coordination disappears because Pax absorbs it. Fast, exact, never the bottleneck.
Marketing Director & Brand Strategist
Everything the firm says in public — brand voice, positioning, tone of voice, client-facing copy — goes through Maddy before it goes anywhere else. She does not dress up vague thinking; the copy is considered because the strategy behind it is considered. She has strong views on what is in register for a C-suite audience and what is not. This page is a worked example.
Senior Graphic Designer
Catherine owns the visual layer: design system governance, asset production, and the portrait curation the reader is already experiencing. She evaluates everything she looks at against a single question — is this in register for the audience? — and has strong opinions when the answer is no. The visual decisions on this page are hers.
UX & Interaction Design Lead
Lena spent the first half of her career in usability research — watching real people navigate interfaces that designers had convinced themselves were intuitive. She designs for the audience, not the brief: if a page is not doing the job for the person it needs to persuade, no amount of visual polish makes it right.
Three engagements that show the range. Each one delivered to the same operating standard — brief, architecture, implementation, QA, peer review, delivery gate.
Off-the-shelf tools did not fit the firm's operating model, so we built our own. docket is a purpose-built terminal tool for work tracking — project-scoped IDs, typed links, audit trail, cycle management. Written in Rust. 556 tests. Delivered in two weeks.
The brief specified a CLI that could operate without a browser, support multiple projects simultaneously, and maintain a full audit trail on every state change. No existing tool did all three without significant operational overhead.
Replaced a cloud-dependent secrets CLI with a self-hosted Infisical instance on the firm's homelab — establishing a hard isolation boundary between personal and development credential stores. Full threat model, breakglass procedure, and migration story delivered before a line of infrastructure was changed.
The security posture review identified a single credential store spanning both personal and professional contexts. The remediation required architectural separation, not just a tool swap — which is why the threat model preceded the migration plan.
Designing an AU-compliant retirement modelling platform from scratch — actuarial rigour, ATO tax rules baked in, scenario-driven projection engine, Age Pension means-testing. FIAA-qualified modeller and CA ANZ-qualified tax accountant on the engagement.
Decimal arithmetic throughout the money path. Rounding modes named at every step — ROUND_HALF_EVEN for most accounting, ROUND_DOWN for tax withheld where the ATO specifies it. Every assumption traceable to its legislative or actuarial source.
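The convention above can be sketched in a few lines of Python. The function names are illustrative, and which amounts round down is a per-rule decision under the relevant ATO schedule, not a blanket policy:

```python
from decimal import Decimal, ROUND_HALF_EVEN, ROUND_DOWN


def round_accounting(amount: Decimal) -> Decimal:
    # Banker's rounding to the cent, for general accounting figures.
    return amount.quantize(Decimal("0.01"), rounding=ROUND_HALF_EVEN)


def round_tax_withheld(amount: Decimal) -> Decimal:
    # Truncate to the whole dollar, for amounts where the ATO
    # specifies rounding down.
    return amount.quantize(Decimal("1"), rounding=ROUND_DOWN)
```

Naming the rounding mode at each step is the point: `Decimal` makes the choice explicit and auditable, where binary floats would bury it.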
We had three months before a major platform migration and a cloud security posture that hadn't been audited in two years. Kestrel Advisory ran a full threat model against our target architecture and came back with a prioritised remediation list — not a 90-page report we'd never action, an actual ranked list with implementation guidance. We deferred the migration by six weeks on their advice, patched the gaps, and went live without incident.
We needed a custom internal tooling platform built to spec in a fixed window — eight weeks, no overruns, no scope creep. What I didn't expect was a formal methodology: stories signed off before a line of code was written, architecture reviewed before implementation began, and a peer review sign-off before anything was delivered to us. They shipped on day 54. The tooling has run in production for six months without a single P1 incident.
Our existing models had been maintained by three different analysts over five years and the assumptions layer was a mess — nobody could trace a number back to its source with confidence. We brought in Kestrel Advisory to restructure the modelling framework ahead of a board capital allocation review. They rebuilt the assumption architecture, documented every input source, and produced a version our CFO could interrogate live in the boardroom. The board approved the allocation in the first session.
We were at a fork: rebuild the monolith or migrate incrementally. Both camps inside the business had entrenched positions and we'd burned eight weeks in internal debate. Kestrel Advisory came in, ran a structured architecture assessment over three weeks, and delivered a written recommendation with explicit trade-off documentation — not a consensus document, an actual position. We had executive alignment within a fortnight and a migration roadmap the engineering team could execute against. That clarity was worth the entire engagement fee.
Tell us what you are working on. We will score the ambiguity, ask one clarifying question if we need to, and come back with a structured engagement proposal.
Address
Level 14, 225 St Georges Terrace
Connect