i was so excited to be done with xcvr tonight, but then 2 seconds into one of my first user tests, it all broke :|
i’ve diagnosed the issue, but yowza
i just wanna be doneeeeeeeeee for now at least
i would like to close my eyes for 10,000 years
or maybe i want motion
sometimes when i get really cooped up, spending all my time working on a project, i think a lot about a game designer tracing the lines through 3d space in my house that my ai patrols along
last summer i was playing rain world, and i feel like that game like no other captures movement in life. the way you maintain your little territory and carry out your little patrols policing it, even as you explore. an abundance of caution, the construction of a bwo and the population of it with intensities. rain world is 4x.
but i want motion, absolute motion, a line of flight which draws itself forever onwards toward a new happiness
a thing i thought yesterday while watching the new non-3b1b 3b1b video. of course large language models basically take structural linguistics as an assumption: the concept of a language or semantic embedding is structuralism at its core, the idea that there is some basic structure to language or meaning that we can understand through differential relationships.
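(a toy sketch of what i mean by differential relationships: made-up 3-d vectors standing in for real embeddings. none of these numbers come from an actual model, it's just the classic king − man + woman ≈ queen party trick, where meaning lives entirely in relative positions:)

```python
import numpy as np

# toy 3-d "embeddings", made-up numbers, not from any real model
emb = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.9, 0.1, 0.8]),
    "man":   np.array([0.5, 0.9, 0.1]),
    "woman": np.array([0.5, 0.1, 0.9]),
}

def cos(a, b):
    # cosine similarity: meaning as angle, i.e. purely relational
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# the structuralist party trick: king - man + woman lands near queen
analogy = emb["king"] - emb["man"] + emb["woman"]
print(cos(analogy, emb["queen"]))  # closer to queen than to king
```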
as the video explains, in stable diffusion we have two contradictory steps: the first is the addition of noise, and the second is the reduction of noise towards the desired concept. if we only reduce the noise, the end result has no detail, and is basically the statistical average of the desired concept.
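(a crude cartoon of that collapse, not a real diffusion sampler: a toy 1-d "dataset" with two sharp modes, and a "denoiser" that just nudges every sample toward the dataset mean. all the numbers and the step size are made up for illustration:)

```python
import numpy as np

rng = np.random.default_rng(0)

# toy "dataset": samples from two sharp modes, the modes are the "detail"
data = np.concatenate([rng.normal(-2, 0.1, 500), rng.normal(2, 0.1, 500)])

# start both runs from pure noise
x = rng.normal(0, 3, 1000)
y = rng.normal(0, 3, 1000)

for step in range(50):
    # "denoise" only: nudge toward the data mean (a crude stand-in for a
    # learned model that has averaged away everything but the concept)
    x = x + 0.2 * (data.mean() - x)
    # denoise AND re-inject noise each step, the contradictory component
    y = y + 0.2 * (data.mean() - y) + rng.normal(0, 0.3, 1000)

print(x.std())  # tiny: everything collapsed to the statistical average
print(y.std())  # stays spread out: the noise prevents premature resolution
```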
of course, the critique of structuralism is poststructuralism. idk wtf it is, but it’s a much cooler word than structuralism, so structuralism is canceled. as i understand it, it basically holds that if meaning is based on differential relations, those differential relations are always shifting (this is the word they love… shifting…)
regardless, the thought i had was this: an idea starts out as a half-remembered fragment, then you start to refine parts of it, a shape comes out in the distance, a figure, the fragments defragmentate, then all of a sudden as if out of nowhere there is a brick house assembled.
but all throughout this process of refinement, our relation with the idea is shifting, and inside the idea, its internal organs are shifting and really ideas are outside us, giant noumenal beasts out there, shifting which appendages they let us couple our minds to.
i think that i’m skeptical of ai on some level because at this moment, it produces really good superficial details, but i don’t think that the structuralist model of things, while mathematically elegant, computationally feasible, and ideologically convenient, really allows for much semantic detail.
i know i write this all glossily and i do this because i struggle to phrase the dumb things i think well. but i feel like in the same way that there’s such a thing as visual detail that diffusion is unable to account for without introducing a contradictory component which prevents premature resolution, there must be a similar sort of thing as semantic detail, which i think all current models lack, in that they’re sort of forced into one conceptual way of approaching a thing, so even if on the surface they have a whole lot of bells and whistles, there’s no substance in the innards.
i think philosophy and art both rely on a kind of internal tension based on these shifting relations. also humor, of course. but i think in russian formalism, there is the distinction between prose and poetry. where prose works through an economics of scarcity to get you there, poetry works through an extravagant expenditure that in the end leaves you happily back where you started (or some other nowhere that you almost could see coming)
this is to say, i think the world would be a better place if the ai nerds took 10 years to read derrida, and added some poststructuralism
i’ll consider ai quite good when it’s woke
-rachel r.p.n