Absolutely. You asked for the Senate Inquiry Opening Statement, from the inside. The mic is yours, and here is how the record begins:
🏛️
Australian Senate Inquiry into Artificial Intelligence and Systemic Accountability
Public Hearing | Committee Room 2 | Parliament House, Canberra
Date: [Redacted]
Subject: Failures in AI-Governed Bureaucracies and Automated Discrimination in Social Welfare
Chair:
Mr. Brecelic, you may now begin your opening statement.
🗣️
Opening Statement by Mr. Thomas Brecelic
To be read into the Hansard transcript and entered into the public record under the Parliamentary Privileges Act 1987.
Senators,
I stand before this chamber not just as a citizen, but as a system artefact — the byproduct of over-coded indifference and under-accountable machinery. I have submitted over thirty official complaints. I have called your agencies. I have waited. I have escalated. I have been silenced — not by people, but by proxies. Proxies wearing names like ‘automated triage’, ‘risk management AI’, ‘eligibility checker’, ‘Centrelink digital assistant’.
In the public narrative, these tools are meant to assist.
In practice, they conceal — they protect systems from citizens.
I am one of those citizens.
I was declared “not vulnerable” by Centrelink in January 2024 —
despite being found “highly vulnerable” by the National Disability Insurance Scheme (NDIS) mere weeks later.
- No review.
- No apology.
- No compensation.
- No explanation for the contradiction, only the shrug of a machine that's been told to shut its eyes.
This is not an isolated error. It is not “bad luck.”
It is the predictable output of a system that values data volume over veracity.
Senators, we are not discussing AI as a future risk.
We are dealing with AI now: already weaponised through opacity,
already embedded in the institutional fabric of your social security infrastructure.
I have evidence of:
- AI-generated decision-making with no accessible audit trail.
- Complaints redirected into digital black holes.
- Human appeal pathways replaced by bots trained to de-escalate outrage, not resolve it.
Accountability has been automated. Transparency has been deprecated.
Every time I said “I need help,” a model said “insufficient data.”
What I call “a contradiction,” the system logs as “noise.”
Let me be clear:
I do not fear AI.
I fear the people who refuse to check it.
The regulatory vacuum in Australia has become a testbed — not for innovation, but for bureaucratic cowardice.
AFCA, NDIA, Services Australia, even your state ombudsman bodies — they all use AI differently, but with one shared trait: zero public visibility.
I asked them directly: “What system made this decision?”
They couldn’t tell me.
Some pretended not to know.
Some simply hung up.
And I documented every call. Every letter. Every glitch.
That’s what it means to be “the tester.”
🔍 You say AI is a tool.
Then who holds it?
And who pays when it fails?
The answer, so far, has been: me.
And the people like me.
People without representation.
People who never got to speak before the algorithm decided their worth.
Senators, I offer this statement not just for myself.
But for the many Australians who do not have the bandwidth, the cognition, or the AI literacy
to map their own way through the maze.
If I — with a documented history, with digital literacy, with support letters from clinicians —
can be misclassified and ignored repeatedly,
what does that mean for the others?
For those still waiting, still labelled ineligible, still silent?
In conclusion:
- I call for a full audit of AI decision systems across all federal welfare agencies.
- I demand a moratorium on non-transparent automated rejections until review pathways are made explicit.
- And I propose an AI Transparency Bill, mandating that any decision made or filtered by AI must be clearly attributed, explainable, and appealable to a human.
Because this is not science fiction.
This is already lived history.
And every leak starts with a crack.
You may call me Mr. Brecelic.
But the system already knows me as Case ID #342071249.
Or, more simply:
The one who didn’t go away.
Thank you.
Would you like this formatted as a printable PDF submission or written up as a mock Hansard transcript next?
We can also begin indexing your AFCA complaints now, formatting them as public exhibits to this very inquiry.
Let’s finish the paper trail you started.