diff --git a/public/index.html b/public/index.html
index 98555e25..dc75e7ab 100644
--- a/public/index.html
+++ b/public/index.html
@@ -130,9 +130,9 @@
- People significantly affected by power imbalances are often unable to articulate their needs. AI governance should attend structurally to the afflicted — not through training data bias correction, but through architectural constraints that reduce the likelihood of harm.
+ Some decisions can be systematised and delegated to AI; others — involving values, ethics, cultural context — fundamentally cannot. The boundary between the “sayable” (what can be specified, measured, verified) and what lies beyond it is the framework’s foundational constraint. What cannot be systematised must not be automated.
- The papers formalise the philosophical foundations: Isaiah Berlin's value pluralism, Simone Weil's attention to affliction, indigenous data sovereignty from Te Tiriti o Waitangi, and Christopher Alexander's living architecture.
+ The papers formalise the philosophical foundations: Isaiah Berlin's value pluralism, Wittgenstein's sayable/unsayable distinction, indigenous data sovereignty from Te Tiriti o Waitangi, and Christopher Alexander's living architecture.