E56| Metaphysics & Power in the Digital Age, w/ Forrest Landry & Alexander Bard


Forrest Landry is a philosopher, writer, researcher, scientist, engineer, craftsman, and teacher focused on metaphysics, on the manner in which software applications, tools, and techniques influence the design and management of very large-scale complex systems, and on the thriving of all forms of life on this planet.


Alexander Bard, born in 1961, is a philosopher, writer, artist, and record producer. Alexander was a fundamental force in creating the Swedish music export sensation — making his small native country the world’s third biggest exporter of music — and in promoting the information technology revolution. He is also a leading figure in the new export of radical Scandinavian thinkers and has repeatedly been named one of Sweden’s brightest minds.

Alexander has written three books on the internet revolution, collectively known as The Futurica Trilogy, together with media theorist Jan Söderqvist. Their first collaboration The Netocrats was originally released in Swedish in 2000, became available in English in 2003, and has since been translated to a further 16 languages with total worldwide sales exceeding 340,000 copies. The second book The Global Empire was originally released in Swedish in 2003, while the third instalment of the trilogy The Body Machines was originally published in Swedish in 2009. These latter two works were released in English in 2012, completing The Futurica Trilogy, in which the authors present their philosophical vision for a global and increasingly virtual society, as a consequence of the interactive revolution.

More recent works include Syntheism: Creating God in the Internet Age and Digital Libido: Sex, Power, and Violence in the Network Society, which will form a trilogy upon the release of the final instalment, still in writing, currently titled Process and Event.

Alexander has frequent discussions on the podcast Sweeny vs Bard, searchable wherever you listen to podcasts.

Metaphysics & Power in the Digital Age

Wed, 2/24 5:32PM • 2:21:06

SPEAKERS

Forrest Landry, Alexander Bard, Tim Adalin

Tim Adalin  00:02

I would love to have a conversation with both of you actually solely about distribution. It’s definitely relevant to the discussion about what’s going on at the moment with the creep of censorship and the importance of building integrous networks. It’s definitely an interesting conversation to be had all on its own.

Like so many of the conversations we could have. And in particular, because both of your works and thinking are so broad, with so many unique ways of using terms with different styles and traditions behind them, there’s a certain sense in which proceeding from the attempt to come to clarity about some fundamentals, or broad themes we can all bite into together, might be the most helpful way of going about a conversation. I’m also perfectly happy to just sit here and say nothing and listen to whatever wants to emerge as well. I’m not saying, hey, we have to have the conversation this way. However, I do have a way to potentially go about how we have our conversation, if that’s of interest to you?

Forrest Landry  01:27

I’m certainly open. It had been suggested to get an overview. Alexander, you had made an offer at one point to help me to understand your work. I thought that was really generous.  I am really interested to do that. I know I need to learn your terminology. So I am available to whatever your interest is.

Alexander Bard  02:12

Same goes for me. It is hard to understand each of us. We each use such different vocabularies. This is due to the backgrounds we come from. It just proves how fiendishly hard it is to communicate these days — especially when you’ve got to do it at the levels where we work, which is, like, systemic thinking. Number one, people who have a talent for systemic thinking are incredibly rare. Number two, to think is one thing; to actually put it into practice is something entirely different. And number three, we also use very different languages.

Alexander Bard  02:55

So for example, when I read you for the first time, I was like, “okay, who is this guy?”. What is he writing, and why is he writing this way? And I kind of decided that, okay, imagine a really talented computer scientist, who’s kind of figured out that there’s a future war between men and machines or something like that. And he’s trying to make men understand how machines would actually think if they had language, which they probably will have sooner or later. And it turned out that he was kind of a Nietzschean ecologist and all of this as well, right. That’s how I experienced Forrest when I read him. And I took the time to understand it. The way to read is to not jump from one word to the next thinking you’re going to figure it out as you go along. You can do that with a newspaper article; you can do it that way with just about anything these days which is sort of meant for a mass market. But if you’ve got to dig really deep into, say, serious academic work, for example, or philosophical work in this case, you’ve got to sit down word by word. And if there is a word you don’t understand, go and figure it out, get a damn dictionary and then try to interpret it from the context. Otherwise, you will be lost in no time at all.

Alexander Bard  04:02

But even decently intelligent people can read Forrest Landry or Bard and Söderqvist. The trick, though, is to understand that the way to read is to read word by word by word. And then you will know that there’s a certain vocabulary we use, the terms return, the concepts are repeated, and then you start getting it. There’s no other way. I think Forrest agrees with me that we both make a very, very serious attempt before we publish anything to actually make it as pedagogical as possible. But Söderqvist and I promised each other when we started writing 25 years ago to never compromise on the quality of our work. We do not make ourselves more accessible and thereby lose the content in doing so, because then there would be no point in us writing. There are so many pop versions of us anyway. That’s not something we’re concerned with.

Forrest Landry  04:47

Correct. Yeah, I’m with all that. I think the only addendum that I would put in is that in an effort to try to make it a little easier, I wrote somewhat holographically. So in effect, rather than expecting the person to go word by word, I figured that they could skip parts and come back, as long as they read through multiple times. There are a lot of forward linkages, where the meaning of the terminology doesn’t become clear until later, or until you see it in the context of multiple other usages. I attempted to pick words that would be as close to the meanings needed as were available, and then to use that as a way to bootstrap understanding to deeper layers. If there are multiple passes made, then there is a continual refinement that’s going on as people encounter the material. That was essentially a methodological device that was put in particularly to communicate some of these ideas.

Alexander Bard  05:53

I agree completely. And, by the way, just having a dictionary next to a work of philosophy doesn’t mean you’re gonna understand the work at all. But I just recommend people not to think that they can jump forward and somehow get it, rather than like you said, reread it, and reread it again. That’s what I do with the Masters to understand what they’re doing. And that’s what I do now when thinking about the world we live in today. Otherwise, I wouldn’t be a philosopher.

Forrest Landry  06:18

So how would you like to begin? What questions or topics or directions would you hope to go in this conversation with me? I mean, like I said, I’m genuinely interested in learning more about your work in a more interactive way. There’s also some of the stuff that Tim suggested as far as orientations for this conversation. And again, I’m actually feeling receptive more than projecting, so I would offer for you to initiate in that sense.

Alexander Bard  06:46

Okay, so we did get an email from Tim with some suggestions, and Tim might want to fill in as well.  

Tim Adalin  06:51

Yeah sure.

Alexander Bard  06:52

It’s a really great start. So the way I work with Jan Söderqvist–and we’ve written five books over the past 20 plus years, and we’re working on our sixth book to be released next year; you can find Bard and Söderqvist on Amazon, if you’re interested, so that’s over and done with–but anyway, the way we work is that we basically use human beings as a constant, and we use technology as a variable. We started writing in the 1990s, having discovered that most of continental philosophy, or the kind of philosophy we loved ourselves, wasn’t concerned with the major changes of the 20th century, which were, for instance, that technology was increasingly taking over the world. And surely, after August 6, 1945, we had an historical date, when we blew up an atomic bomb which changed things forever. Now, if you want an event, that certainly was an event.

Alexander Bard  07:42

There are also other events. For example, we had the first cosmonaut, you know, up in the sky, who was showing us pictures of planet Earth being green, and blue, and beautiful, in a very, very large, very, very cold universe. And these pictures are also now sort of ingrained in our minds as well. And that was also an event that shaped us in the 20th century. And the funny thing is that philosophers basically disregarded this, which is sort of like… well — you also had cosmology, where we discovered there wasn’t just the Milky Way! There were billions of Milky Ways out there! And the Big Bang, probably a Big Bounce these days, also completely changed our worldview. And while we lost a sense of space on this planet (right here the three of us are communicating in real time with one another, sitting on three different continents; we are literally evidence of the fact that space has disappeared on this planet), space as a concept for the universe has of course exploded. So all of these things have to be taken into context if you’re going to do contemporary philosophy. And philosophers just wouldn’t, because they would just go on with their sort of old academic themes, and quote Heidegger a hundred times or something like that, which I think is completely irrelevant since the internet is taking over the world and we’re all getting connected to one another. Time becomes absolutely essential as space has disappeared in most of our lives. So the time axis is now more important than ever. And this on top of the fact that we have climate change and existential disaster about to happen.

Alexander Bard  09:14

Now, all of these things need to be worked on. And I think that’s where you and I agree, Forrest. And the way I did it with Jan, to put it very simply, is this: human beings do not change that much–they might have the desire to fuck people who are more brilliant than the previous generation, but they never do, because when people get drunk or high, they fuck all the wrong people, and they’re just about as idiotic as they ever were. So human beings are the same, you know. They’re ants with pretensions, as I call them. So we had that for the last 10,000 years. And a fat old lady sat down somewhere in Babylonia some 5,000 years ago, and she managed to fool some people into the idea that the permanent settlement would be a good idea. And they had enough harvesting of, you know, grain or corn or whatever, to make enough beer to actually believe it. And that’s been called civilization ever since.

Alexander Bard  10:01

No, humans haven’t really changed. I’m a firm believer in the sociant: the idea that there were some 60 to 70,000 years during which current hominids, Homo sapiens essentially, were created the way we were. And we were shaped during that era. And we haven’t had much time to adapt to anything in the past 5,000 years, which is exactly why we created the disaster we’re moving towards. So we can use human beings as a constant. But technology then becomes the variable. And technology is now undergoing dramatic change. The pace of change when it comes to technology is dramatic. Yes, technology gets stuck in certain areas. Peter Thiel is very happy to talk about that just because he doesn’t get his, you know, flying car or whatever he wanted. But when it comes to the actual things that really do matter, like what does it mean to be human when 8 billion people, through their computers and smartphones, are directly connected to one another at all times? And I can joke with people these days: so you think you can go offline? Have you heard of surveillance cameras and satellites? They’re like everywhere; you’re never offline, baby. You’re all online, because the satellites are now basically a web around the planet interconnecting everything, including all machines and technologies too.

Alexander Bard  11:17

So there is this sort of hyper-technological phase we’re in right now; existential risk; and a constant, which is the human being. That means studying the human being all over again, beyond anthropology, isn’t that interesting; it’s really the relationship between man and machine which fascinates me. And there’s tons of work that needs to be done here. And a lot of that work has to be systemic work, which is, I think, your genius, Forrest. And there are very few people who can think man versus machine, stuck on a planet with limited resources, as systemic thinking.

Forrest Landry  11:52

Understood, agreed. So I’m basically with all of that. A couple of clarifying questions, just kind of orienting my context with respect to you. So you’ve outlined the kind of principal issues. I very much agree that the human being is a constant: I’m not going to be changing the nature of the person in any substantive way. And so to some extent, the place where there is mutability is going to be in the use and the integration of technology. So the man-machine relationship. I do have one question, which is that I do regard culture as mutable and very much defining of what the relationship between man, machine, and nature is going to be. Obviously, I don’t regard nature as mutable either, but it is definitely impacted by these choices that we make. So in other words, the natural laws don’t change, but whether or not the ecology remains is an open question. So I guess the first thing that I would ask is, what do you view as the relevance of culture with respect to the work that you’re doing on the man-machine relationship? And I guess, what is the longer term… I think I’m actually a little confused about… First of all, I agree with the nature of the technology takeover: that you can’t get away from it. But I don’t know whether you’re advocating for essentially a kind of technological singularity, or you’re advocating for a kind of… I guess I don’t know what your advocacy is for, actually. Like, I get the sense about the sort of cultural-religious element, but I don’t know how that fits. And that’s kind of why I’m asking about this.

Alexander Bard  13:44

Okay, great. So my two heroes are Hegel and Nietzsche when it comes to Western thinking, and their brilliance lies very much in the fact that they’re descriptive, rather than prescriptive, philosophers. If you jump to what I’m advocating, like I’m going to be some kind of preacher, I prefer not to go to that place, at least not very quickly, rather than to stay in the descriptive mode for as long as possible, which I also like about your work. So I read you as a descriptive philosopher, because I think that’s actually what’s brilliant here. As we get closer to clarity, as more people begin to understand the issues and the complexity itself, it gets easier to get to the solutions. But this is, for example, why, in the new book Process And Event, which is going to finish off the Grand Narrative Trilogy that Söderqvist and I are writing, we actually took the decision to celebrate the Bronze Age and the engineers of the world for whatever they can do, because we think we need them more than ever.

Alexander Bard  14:43

And we’re going to basically tell our own kind–you know, the pillar saints and the boy pharaohs of history, as we call them–that whatever happened around 800 before Christ, which has been celebrated as the Axial Age in academia for the past 300 years, was actually a disaster. Because it was the pillar saints and the boy pharaohs of history that actually took us to the brink of extinction. It wasn’t the engineers. Engineers, basically, they get a drawing, and they get an idea, they’re going to build a skyscraper, and they build it. And whenever they invent new technologies, they usually invent technologies that kill them anyway; it’s called the guillotine syndrome for a good reason. So we don’t really have to be scared about engineers; we need them more than ever. We need sophisticated, fantastic engineering. We need AI to be involved in that. If we’re ever going to build a fusion power plant and solve the energy issue, we probably will have to wait until AI can help us design the sort of reactors we need to make that feasible to begin with. So I’m all for that sort of symbiotic intelligence and symbiotic transcendence, if even that’s possible, out of man-machine relationships.

Alexander Bard  15:46

That means we need to rewrite all of history, as Hegel would say. We need to rewrite history completely. We need to go back and say, why on earth did we celebrate all these pillar saints who were just full of their own egos, sitting on their pillars in the woods, meditating all day long, forgetting about the material world out there? Well, they’re not much use now, are they? No, they’re not. They just end up as crystal-healing commercialists in California or something, not much else. I don’t think Buddha’s fans were any better than anybody you find in Southern California anyway, to be honest about it.

Alexander Bard  16:19

So the real issue at hand here is this: culture really kicks in and starts with civilization. It’s meaningless to speak about culture before that. Whatever sort of nomadic tribes of hominids were walking across the planet, they were never more than 3 million at any given time, because you can’t support more than 3 million people if you’re going to go for hunting and gathering, right? Once you settle, you create much larger populations. And eventually you create more density and more weight on the planet. And, you know, like mutations now with the pandemic, you also increase the risk of somebody coming up with an atomic bomb that can blow us all up. So the longer we go along the time axis, the less we understand who we are and what we’re doing. And the denser the population, the larger the population gets, the higher the risk gets that something terrible is going to happen.

Alexander Bard  17:10

So that is culture. That is the honest idea of culture. I think Sigmund Freud put it out firmly in his book Civilization And Its Discontents. I think we even had an opening chapter in our Digital Libido book that says civilization and its discontents, and everything outside of that too. It’s like everything is a discontent for humans. But the point here is culture, which is the word Sigmund Freud is using in German here: Kultur. And what he means is that what we created over the last five thousand years is something we both existentially and mentally feel increasingly frustrated with, because we created a monster that is no longer us. We created a monster, for example, that is so full of shortcuts and quick fixes that if you’re 20 years old today, and you don’t like Wall Street, you start an Occupy Wall Street, and the Wall Street guys know you’ll be over in three days, because your attention span is now so damn short that after two days the Wall Street guys will come down to the street and greet you with an Instagram camera and sell you a fucking t-shirt that says “I was there”. I mean, these 20-year-olds, there’s no way they’ll be winners; that entire generation is lost already. We’re going to lose them because we don’t understand digital. And we don’t understand what these technologies do to us. And we must get to a much bigger picture to understand the smaller phenomena within that picture. The desire is to be like a satellite as a philosopher, and look at the largest possible picture, to then be able to zoom in on the details, but also be able to see the connections that other people do not see. Otherwise, how are we going to see even remotely the possibilities that could take us out of the current predicament? That would be impossible.

Forrest Landry  19:04

So again, I’m basically wanting to create a comparison, or sort of an orientation. So in other words, again, I’m partly in descriptive mode. I think your notion about descriptive and prescriptive is correct. And my work largely is descriptive, but it attains a certain effectiveness by being prescriptive and then finding out the implications of that prescriptiveness.

Alexander Bard  19:35

We should say you are a scientist and I’m not, so that’s perfectly fine with me. Because I expect engineers and computer scientists and others to be able to reach conclusions, while I’m just trying to get the biggest possible picture and then provide it. There’s a difference between us that we could actually play with here; that’s very creative.

Forrest Landry  19:53

Well, that comment helps a lot. It gives me a sense of understanding why I was misunderstanding you. So, again, to try to create correspondences between our work, because I believe that’s useful. Again, it’s sort of a language exercise at this point. But first of all, I really love the articulation that you gave to the flaw of the axial age; I think that is maybe a way to put it. And so, going back to the notion of really understanding the human condition. Like, you treat the human being as immutable; at one point, you basically said, “well, since it’s immutable, let’s ignore that.” But I think I’ve gone the opposite way there. Part of what I have been attempting to do is, by understanding the human condition, to understand what the fuck-up of the axial age really is. So in other words, you know, the boy pharaohs, I mean, I’m just going to basically say, the sociopathy of the age, right? If you look at and apply DSM-5 criteria to religious figures, and deity manifestations, and such like that, you end up with a whole panoply of diagnosis potentials that actually aren’t very complimentary. And so, you know, I’m sure you can unpack that coded language at leisure and find all sorts of interesting things to riff on.

Alexander Bard  19:56

I will certainly credit you for “the sociopathy of the axial age”; let’s start there, I agree with you completely. I mean, hey, remember this thing: whenever we have abundance in human history, that’s when idiocy starts, because we can afford to let the idiots live. But the problem with that is the idiots will be rampant everywhere, and they will compensate for the fact that they’re idiots. And they will be pompous about it and pretentious about it. I’m not a big fan of either Christianity or Islam, and I’m not a big fan of pop. I’m not a big fan of things that get popular, because they usually get popular because they flatter people, right?

Forrest Landry  21:54

We don’t need to reiterate those points. I get it, I really do. And furthermore, I’m with you on all that. So, you know, I don’t necessarily have an intention to go off pissing a lot of people off, so I don’t mention it very much. But, you know, there’s definitely a sense here that if you factor in the evolutionary systemic process–in other words, evolution can be understood in a kind of rigorous mathematical sort of way, a kind of process or computer science oriented way–that gives you the ability to use it as a sort of tool and to model things a little bit. So for example, as you mentioned, when you have people come out of tribes and into cities and start to have civilization at larger scale, then the compensatory mechanisms that would normally have suppressed sociopathy and dark triad characteristics evaporate, and now you end up with dark triad personality characteristics actually being adaptive to the urban environment.

Forrest Landry  22:53

And so I think that the axial age is itself an outcome of that process. And it’s not just a manifestation of it; it then became kind of a core leverage point for those processes going forward. So in other words, the whole set of hierarchical religious precepts basically created a kind of organization for society that maybe created some amount of differential advantage for certain people relative to others, depending upon the degree to which they had those kinds of characteristics. So in that sense, you know, it’s become critically important, from my point of view, to recognize that those elements are part of the human condition, and that we are going to essentially have to compensate for them in the use of technologies. Technology is an amplifying factor. As you said, with nuclear weapons we now have capacities that exceed most concepts of what deity would be. You know, think of Thor’s hammer and how he can create lightning and all, and then you’ve got, you know, nuclear weapons that can create a hundred lightning bolts in a split second.

Alexander Bard  24:04

Yes. 

Forrest Landry  24:05

And so in this sense, there’s a huge degree to which there is an amplification that technology inevitably has. And when you combine it with the sociopathy thing, things go rather badly rather quickly, right? So in this sense, I can’t just rely on the engineers, because the engineers, being human and in the context of humans with these characteristics, are in effect going to become subject to that influence. And of course, you end up with, well, basically all the problems we currently have. And so in this sense, I’m really leveraging cultural dynamics fundamentally as a kind of moderation for the human elements in the context of the technological element, as the last gasp of stabilization possible.

Alexander Bard  24:55

Oh, yeah, that’s why philosophy influences architecture. And at the end of the day, we’re talking about architecture here. We’re talking about architecture of infrastructure. We’re talking about urban planning. We’re talking about how do you maintain some kind of political administrative order that can protect us from existential risk to begin with, and then can build a better world.

Forrest Landry  25:15

One that’s not going to be compromised by the dark triad manifestations, the sociopathy manifestations, that did so effectively compromise it for so many, at this point, thousands of years, right?

Alexander Bard  25:30

Why don’t you tell those who are involved in our conversation what you mean by dark triad, so we get that right.

Forrest Landry  25:35

Sociopathy, narcissism, Machiavellianism. The kinds of things where an individual operating in a self-oriented way privatizes commons benefits. And by doing that, they basically divest the commons of value and make the world more fragile. So in the transactional schema between individuals and communities, in effect, you end up with a pumping of value from the community to the individual.

Alexander Bard  26:11

Using our vocabulary here, for those who are familiar with it, that means what we call those who ignore imploitation. We regard the principle of imploitation as way more important than exploitation. And imploitation is basically what an old woman would tell you in the tribe, when she would say, “this is a resource you have; you’re going to try to get as much out of this resource as you possibly can by keeping it”, okay, which is the complete opposite of exploitation. So we would regard the dark triad, the way we use it, Söderqvist and I, as characters that have no respect for the imploitation principle at all. So they would then exploit everything they get close to: minds, you know, resources, whatever. They couldn’t care less. That’s the problem with the dark triad personality types.

Forrest Landry  26:53

Understood and agreed. 

Tim Adalin  26:55

Forrest–

Forrest Landry  26:56

Yes.

Tim Adalin  26:57

Can I ask you guys to fold in a notion here, because there’s something that’s come up in relation to prescriptive and descriptive. And I wonder if there’s not also something that should be added to that distinction. And that’s the notion of invitation. Philosophy as invitation. And I think about things like time; I think about us coordinating here. And I think about what enables choice appropriately; I think about an invitation extended. I mean, I can describe where something’s at, right, and I’m not telling you you have to do it, but I’m inviting participation in something. And so when we talk about moving forward here, what I’m sensing is culture as something we can interact in, as a way to reorient our collective whole in relationship to the change upon us.

Forrest Landry  27:51

I see it, I definitely see it. I think that part of the reason it wasn’t mentioned, just to give you some context, is that when I’m thinking about causal process… so for instance, we have choice, change, and causation. And insofar as I’m thinking about the relationship between technology and the human being, I’m thinking about the relationship between choice and causation. And change is like nature. So for instance, choice adheres to the human. So the invitation would be to talk about how we enable choice. Whereas when we’re looking at, you know, say computer science, I can describe why a thing maybe has the characteristics that it does. So in other words, I can look back at nature, and I can say, well, you know, it looks like when I do this, this happens, right? So I can say the boiling point of water is roughly 100 degrees Celsius, if the pressure happens to be about normal. And the notion of descriptive, in that sense, is observational. Whereas the notion of prescriptive would be that in the context of, like, a computer program, I would set a variable equal to a thing. Or I would say, as a mathematician, from this moment onwards I’m going to define this term to refer to this construct or this theorem or something. Whereas in the larger sense, when we’re talking about culture, it is an invitational thing, because I can’t condition or control anybody. Right? There’s nothing I’m going to do that’s going to be other than influential, right? I’m not going to be able to cause a person to engage in a particular behavior and still have some notion that they have sovereignty or identity or any kind of personhood at all. If I’m causing things, right, then I’ve made them into an object rather than a subject.

Forrest Landry  29:50

So in that sense, what I’m very much interested in doing is to say, okay, yes, we can regard invitation as a kind of recognition of others’ choice. And if I’m being prescriptive, I’m being prescriptive in the sense of finite things, not people. So for instance, you know, prescriptive in the sense of trying to conditionalise what a machine can do, but not necessarily what a person can do, because that’s obviously not the case. There’s another level, which is, I think, somewhat relevant too, which is that, you know, when we’re looking at, say, space and time… I mean, you know, Bard made the point very clearly, I think, that our notions of space have changed enormously in the last few hundred years. Space has collapsed, and it’s become much more relevant to think about time. But to me, there’s a third element to that, which is the notion of potentiality, right? We have actuality, and there is potentiality, which is what could happen. So in the same way, we can think about matter in space as a kind of content-context relationship, right? We have things in the universe. I can also think about forces in time: how much influence there is, how much, you know, pressure, or whatever. And again, it’s not necessarily a causal thing, but it’s an influential one. And then finally, I could think about probability in the scope of possibility. So if I’m really trying to understand things like risk, I need to basically be able to think about the potential, like the counterfactual of what could happen. How likely is an existential risk versus how impactful would it be if it happened? And right now, I feel that when we’re doing game theory and that kind of stuff, first of all, the notion of how we think about those concepts is not nearly extensive enough. The math is not complete. It’s trying to treat game theory on a single level, rather than on a plurality of levels having different scales. That, first of all, has some consequences.

Forrest Landry  32:14

Secondly, there’s a real notion here that you can’t do statistics on the kinds of things which we would consider to be existential risks, right? Things that are unlikely to happen, except once every 1,000 years, and are basically not occurring very often, are not repeatable enough to be subject to the kind of epistemic techniques that we would use with the scientific method. Science requires a thing to be both observable and repeatable. Well, existential risk, as a thing that has such a huge impact, is more towards the creation side of the world. I mean, it has an existential impact. But it basically is so much change, so fast, and so infrequently, that it’s neither observable nor repeatable. It happens differently every time, and it doesn’t happen very often. And so the repeatability is effectively nil as far as our observational techniques are concerned. So, in this particular sense, having a really good notion of the potentiality of things connects back to your notion of invitation, right? What do I invite? What risks do I invite into my current context, in terms of how I’m discussing things, or what kinds of technologies I create? 
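Landry's point that the repeatability of rare events "is effectively nil" can be made concrete with a small simulation, an editor's sketch with assumed parameters: for an event whose true rate is once per millennium, most century-long observation windows record nothing at all, so an observer's empirical estimate of the rate is usually exactly zero.

```python
import random

random.seed(0)
TRUE_ANNUAL_P = 1.0 / 1000.0  # assumed: one event per 1,000 years on average
WINDOW_YEARS = 100            # assumed: a century of observation

def observed_events(p: float, years: int) -> int:
    """Count events in one observation window (Bernoulli trial per year)."""
    return sum(random.random() < p for _ in range(years))

windows = [observed_events(TRUE_ANNUAL_P, WINDOW_YEARS) for _ in range(10_000)]
zero_event_windows = sum(w == 0 for w in windows) / len(windows)

# Roughly 90% of the simulated centuries contain no events at all,
# so frequency-based inference has almost nothing to work with.
print(f"fraction of windows with no events: {zero_event_windows:.2f}")
```

The analytic value is (1 - 0.001)^100, about 0.905, which the simulation approximates.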

Forrest Landry  33:37

And so in effect, to go back to a point that was made earlier, if I just think about the engineering piece, and I don’t think about the human dynamics piece, there’s a very high level of risk that the things we build will be even more toxic and more damaging than nuclear weapons. And this is particularly the case when thinking about things like artificial intelligence. At this point there are actually strong-form proofs that we cannot expect, and very much should not expect, that artificial intelligence would actually be cooperative with any of our interests. They won’t actually be of benefit to humankind, no matter how Pollyanna our beliefs about that may be, no matter how much people may think that they have commercial interest to go in those particular directions; it is a foregone conclusion that it will not work out the way they expect. Not in the long term. And this is such a critical point because, in the balance between descriptive and prescriptive, it becomes possible to know this. And therefore, it becomes possible to make some assessments about the values that we would want to have with respect to what we invite in terms of human potentialities and technological potentialities. In a somewhat esoteric circle, there’s a statement: do not invoke what you cannot banish. And in this particular case, as a philosopher, I have walked the balance between invitation and technology. I’ve built things, I’ve invited stuff into the world, so to speak. And obviously, as an engineer, I think about prescriptive and descriptive quite a bit. So in effect, there’s a real recognition here that without a clear understanding of the balance between those three, we could very easily end up inviting things, because those boy pharaohs, not thinking long term, end up basically telling the engineers to build something that is absolutely dangerous. 
And now at this particular point, I think we have the tools to really understand and make better choices about this. And that it’s critically important that we do so. And that the only way we’re really going to be able to do that is understand the human condition well enough to be able to teach these concepts without getting hung up in the terminology, which I think we both understand as being very relevant. So I guess that would kind of be the way in which I’d sort of sum up.

Alexander Bard  36:11

I can just add to that that philosophy is an art form, for good or bad. I started with theatre when I was a teenager. I wrote my own plays when I was like 19 years old, and then I moved on to philosophy like 30 years later. But I’m not going to write a banal book that says you should do this, and this, and this. Jordan Peterson could do it, but then he’s just about making your bed and fixing your marriage or whatever. These are much bigger issues we’re talking about here. So you can’t do that. So when I said pedagogical, that’s what I meant: inviting. You work as hard as you possibly can. We spend a lot of time and work on this to try to be as accessible as possible, because accessibility is inviting in itself. But then you create a platform by being descriptive, and by that we mean: everything that ever happened in history, up until now, is resource material. And whatever we can use, if it fits into the narrative, will certainly be there to describe the bigger overall picture that we invite people to see.

Alexander Bard  37:08

It’s more like, wow, that’s how complex the world is. Right? Okay, then people can sort of start moving around on the floor we’ve created as philosophers and start to create things. So this is why I’m celebrating engineers here, in response to Forrest: basically as a dialectical response to the enormous focus on academics and, you know, philosophers and sociologists and whatever you like. I’m killing my own breed here. But I think after the last 3,000 years especially, we can look back and say, it’s actually when we built stuff, and knew what we were planning to do when we built it, that things worked. For example, we built cities between rivers, so that the river cultures would not go to war with each other any longer, but would actually create peace in between. Meaning we built a big temple, and around the temple, of course, soon there was a trading post, and trading is much better than killing somebody. So you would eventually realize that the bloodshed could stop, and you could actually have peace between cultures by creating a temple and a city between valleys. And after a while, we realized that these cities were also places for some people to control others from, and then they became nation states and things. 

Alexander Bard  38:18

But if you look at history that way, we can use all of that material. And knowing we’re artists, it’s a bit like: I’m not going to write a play, if I work with theatre, to tell people what to think. I’m going to write a play basically showing that this is actually how complex and fantastic the world is, and then inspire them. That’s the invitation: to do something with it. I don’t believe in just opiates, for example. I think historically we’ve proven again and again that dystopianism doesn’t work. It’s understandable. One of my fellow natives, Greta Thunberg, is probably the most famous dystopian ever by now. But at the end of the day that’s not gonna work. I don’t think utopianism works either. And that’s why we talk about protopianism in our work. We talk about something where you optimize different systems. And obviously one of those systems has to be the planet itself, which has to be the basic system of them all. And if any of these systems operates in a way that deteriorates the system in which it’s built, then that system has to be self-punishing, so it stops itself from acting to begin with. Now, that could possibly be implemented in AI, for example. But we’ll see as we go down the road.
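Bard's "self-punishing" rule, a subsystem that stops itself rather than deteriorate the larger system it is built within, can be sketched as a small constraint mechanism. This is an editor's illustration with an invented class and threshold, not a proposal from the conversation:

```python
class ConstrainedSubsystem:
    """A subsystem that self-suspends before harming its parent system."""

    PARENT_FLOOR = 0.5  # assumed threshold below which the parent counts as deteriorated

    def __init__(self):
        self.parent_health = 1.0  # health of the enclosing system (planet, etc.)
        self.active = True

    def act(self, parent_cost: float) -> bool:
        """Attempt an action that imposes a cost on the parent system."""
        if not self.active:
            return False
        if self.parent_health - parent_cost < self.PARENT_FLOOR:
            self.active = False  # the self-punishing step: stop acting entirely
            return False
        self.parent_health -= parent_cost
        return True

s = ConstrainedSubsystem()
print(s.act(0.2))  # True: the parent can absorb this cost
print(s.act(0.4))  # False: this would cross the floor, so it shuts itself down
print(s.act(0.0))  # False: it remains suspended afterwards
```

The design choice worth noting is that the check runs before the action, so the constraint is preventive rather than remedial, which matches the "stops itself from acting to begin with" phrasing.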

Forrest Landry  39:25

It’s interesting you mention philosophy as an art form. On one hand, I’m very much agreeing with you. On another, I notice that I actually operate far more as an engineer. And that in effect, it’s been in the capacity to be in both worlds that a large extent of the effectiveness of the metaphysics has come out.

Alexander Bard  39:43

Yeah, and I can add my response to that, which is anthropology. So I call myself an anthropologist when I start using data, which I do a lot these days. And then I go into the philosopher mode when I’m artistic. So it’s probably the same way.

Forrest Landry  39:57

Yeah, there’s a sense of science and, obviously, again, the sort of descriptive: observe what is, both about the human and the nature and the machinery. I think I find myself a little bit more careful about the long-term perspective. So in other words, when we say we celebrate the engineering, and we celebrate the specific art, or we go and we promote these particular things, this is a little bit where we get back into axiology. And I mean that in the sense of values, not in the sense of the boy kings. So, in effect, a lot of the attention comes back to: how do we deal with the boy kings? How do we deal with the sort of habit formation, addiction patterns, neural limbic system overload associated with hypernormal stimulus, and effectively create a capacity on the part of groups of people to be more coherent, to make wiser choices? So, for instance, there are some deep built-in biases, both individually, on the part of specific individual people, but also on the part of whole groups. So there’s a phenomenology to that, that to some extent precludes my having the notion that I can do more than invite into the present. I can’t invite into the future. So I don’t find myself thinking about dystopia or protopia except insofar as risk analysis is concerned.

Alexander Bard  41:45

This is where we work with lynch mobs throughout history, because the lynch mob is when people get really idiotic in groups. And in a way, we are currently a lynch mob against nature on this planet, right? And then we work with the opposite, which is: what would then be the good group, what would be the constructive, creative group that understands the territory it’s walking through? And this is why the exodus mythologies throughout history are so important to us in our work. And we call it exodology, you know, to put a positive tint on it. Exodology is basically: how do you organize people in such a way that the overall effect of the group (again, systemic causality is what we call it) works in a positive way, meaning at a minimum that it is sustainable, right? That is what we call protopianism, and the word we use for it is exodology. And again, my science here is anthropology. It’s kind of scary how closely connected we are; we just come in from two different angles. 

Alexander Bard  42:40

We work a lot with understanding what it means to be exodological, and to be in a group that actually is aware of how it works as a group, aware of where it’s going, and aware of understanding the territory it’s crossing, so it can move from the old to the new, say, from an old paradigm to a new paradigm, or even from one territory to another territory. But the opposite of that is lynch mobs. And this is exactly why our work also touches the political so much today, with a lot of the stupidities that are going on at the moment with these lynch mobs going after one another. I’ve written about the alt-left and the alt-right for years now and warned people that they could be a problem. And they could take the attention away from the far more serious systemic problems we have, to the extent that the systemic problems will actually occur, you know, the disaster will happen. And I think that’s important. 

Alexander Bard  43:31

It’s really important to look at how systems operate, how human systems operate. We work with old classic concepts, like empires, nations, cities, anything larger than the tribe, because usually, when we live in a tribe, it actually works. It seems to work: tribes have survived for hundreds of thousands of years. These other social constructs are very recent in history, and we still have no idea what they are. We still assume that an empire like China or America at the moment is a good thing in itself, when it turns out at least the most wealthy people on the planet seem to prefer Singapore, probably for a good reason. Right? So yeah, absolutely. And again, where data has to be used, number one, in what we call this sensocratic way, like you have sensors everywhere, measuring everything, is obviously down to understanding the direct causalities and the effects we have as human beings on what we do.

Forrest Landry  44:25

I guess this is probably where some of my questions come to bear. First of all, I love the way you’re describing protopianism here. Exodology in the sense of leaving something confused me a bit, but I get the sense now, the way you’re describing it–

Alexander Bard  44:45

–You know, a paradigm shift, for example. The people who first get out of the old paradigm and move to the new paradigm, whoever the winners are with the current shift to digital, they’re obviously ahead of everybody else, and at best, the other guys can sort of mimic them. That’s what we call exodology so that we historically get it right.

Forrest Landry  45:01

Understood. So in this sense, when we’re looking at the systemic understanding of the causal dynamics–and I love that you’re comparing that to lynch mobs, which is also something I think about (it’s the unconscious reactiveness versus the sort of conscious, to some extent anticipatory, but really clear about where we are trying to go, in the sense of what our values are and how we integrate the human and the natural and the machine worlds, essentially). So, in a sense, the notion of focus feels to me, the way you’re describing it, right on. It’s like, we’re clearly aligned here. 

Forrest Landry  45:16

I haven’t done very much, obviously, in the space of trying to connect it back to notions like empire or nation states, or conventional ways of thinking or modeling in that particular space. I’ve since narrowed my focus quite a bit. I’m looking at these same issues, but in a much more specific sense of how that group actually operates. Because if we are unconscious as to our own nature going into that particular group, we take the sociopathy with us. And even though some people, and maybe the entire group, understand the causal dynamics of the world and of the machine well (and by world, I’m referring to the natural world, largely), the idea here is that if we don’t have the anthropological knowledge built in, in a way that is itself not just causally reified (I want to go more than that, but at least that), then we still end up with a new formation of boy pharaohs in the new context. And that, of course, just means that we’ve taken the worst parts of ourselves with us into the new realm. So in effect, I’m curious to know: to what extent has that particular dynamic been integrated into your thinking? And what sort of compensatory things have you proposed as a way of dealing with it?

Alexander Bard  46:32

Well, one of the reasons I’m interested in a concept like empire is that there are some empires that were highly successful; at least they lasted for a long time. And there were others that imploded in bloodshed pretty quickly. And it turns out that matters for an imperial structure, and technology today is imperial. We wrote a book called The Global Empire in 2003. Of course, that title was misunderstood. We knew it would be, and so do you when you write your books, too. But anyway, the global empire is basically the result of the internet protocol. It is just that technology itself will operate on an imperial, global level, because technology has no interest in borders. Borders are incredibly human things. 

Alexander Bard  48:02

The human beings, though, as a reaction to that, have gone even more tribal. We go into smaller and smaller groups, which is exactly what’s not gonna solve our problems. And so you need to figure out what kinds of systems there are historically that you could promote and say, well, maybe you should look here, not there. And it turns out that during antiquity, Persia managed to create empires that held for about 2,200 years because they were built on certain values. One of the values the Persians built their first empire, the Achaemenid empire, on was that we should not boil the children of our enemies in oil when we conquer their city; we should rather shock them by kissing the feet of their god, and invite the king we just defeated to stay in power as a local chief in the place we just conquered. And thereby they created the first empires on earth that actually lasted for thousands of years. 

Alexander Bard  48:56

And most of our concepts today… I think the West starts with Persia, by the way; it’s not a Greek thing or anything like that at all. The West is essentially the Middle East. And then the Middle East got a little extension called Europe, and Europe got some cannon boats and a printing press and went absolutely mad and then almost destroyed the planet in the process. But that’s what Europe is. Europe is like a tiny little bit of the Middle East, nothing else, right? So the Middle East is the West. India and China are the East, historically. And it turns out, when you look at it that way, the Persian empire was highly successful. And there was a major mistake at the end of the Bronze Age, which probably more or less caused the end of the Bronze Age, and that’s when the Egyptians started mimicking the Persians and invented the Egyptian Empire. Now, there’s a problem if you only have one river rather than two. Another term for Egypt, as one of my assistants, Peter Tauson, calls it, is Monopotamia. Which is very clever: it’s the understanding that a culture that only has one river will of course think we should only have one pharaoh, and he should be both the priest and the chieftain or the king or whatever, everything in one, and he should worship God, and then everybody else should worship him. 

Alexander Bard  50:00

Now, that lasted for six years, and it was horrible. It was basically Pol Pot’s Cambodia during antiquity. Thankfully, it fell apart just like Adolf Hitler’s Nazi Germany did. But the problem with these guys, these boy pharaohs, is that they do pop up in history and they cause enormous havoc. Hitler was responsible for almost 100 million people who died in Europe. That thankfully stopped the Europeans from thinking so highly of themselves, at least. But the thing is: okay, so you’ve got a system that lasted over 2,200 years, and you have another system that lasted six years. And now, philosophically, both you and I have to respond to the Communist Chinese dream, which is the only clearly defined dream for the internet age: where they put a boy pharaoh called Xi Jinping at the top, and all data the Chinese get their hands on will go to the central Chinese communist computer. Those are the conditions for working with tech in Shanghai. Believe me, I’ve worked there, so I know that.

Alexander Bard  51:04

Okay, that’s an idea about how to operate the world. It’s probably quite feasible that China will go from being the worst criminal of all to actually having policies in place within the next 10 to 15 years where they could at least say: we’re saving the planet, what about the rest of you? In which case it’ll be very, very hard to respond to them and not let them take over everything. So we work closely with people in South Korea, Taiwan, and India who really want to create an alternative to the Chinese, something the Americans are not that concerned with, because they think it’s like a military conflict or a prestige conflict, like two guys bullying each other in the schoolyard, when the world is on fire. Well, I’m sorry, but the Chinese have figured out the world is on fire. That’s part of their game. So as philosophers, that’s why I also have to work with politics. I have to be very anthropological, in the sense that there are systems (we do culture studies, as we call it) that over time seem to be much more sustainable than other systems are. And maybe we should look at those first.

Forrest Landry  52:06

I think this may be the first place where you and I depart. On one hand, I definitely agree that we should absolutely look at history and understand it and study it really well, specifically looking at what kinds of things have worked and haven’t worked, and why they did or didn’t. The examples and the comparisons you’re making I think are absolutely brilliant. And your overall assessment I actually also very much agree with, which is that the way in which the United States is conceiving of the situation is just completely inadequate to the needs of the situation. So, out of all of the agreement that I have with what you’re saying and what you’re doing, which is actually most of it, the place that I keep coming back to is this: in agreement with the earlier notion that you also stated, that philosophy has not fully accounted for the effect of technology, I therefore don’t presume that the historical examples are really going to be as relevant as we would want them to be, because of that influence.

Alexander Bard  53:20

Oh, I agree. I agree. My point is that power sharing, or a power split, must be built into the system from day one.

Forrest Landry  53:28

I’m with you on that one. Okay. 

Alexander Bard  53:29

That’s what the Persians did well while the Egyptians did not. That’s what Americans still do well, thanks to the Constitution, hopefully, compared to the Chinese. So it starts there. That’s my point. A system in itself is not going to be intelligent with a boy pharaoh at the top. That’s basically my point. 

Forrest Landry  53:41

In any manifestation.

Alexander Bard  53:47

The rest, I’m totally with you, Forrest. I am a philosopher of technology. I came out of the continental tradition in Europe as one of the first major proponents of the view that if we don’t do technology now, we can’t do ecology; we can’t do any of the really meaningful things we should do as philosophers. I’m not interested in looking down here, looking at myself, my soul, and being an existentialist or anything. We don’t have the time for that any longer. So I’m totally with you on the technology aspect. There is a limit to how much we can use history to understand the current predicament we’re in. Absolutely. It’s only in understanding what it means to be human, and how humans organize among themselves, that history has a point. The rest comes down to an entirely new field compared to what we had historically.

Forrest Landry  54:29

Okay, so in that sense, I’m feeling more comfortable with this approach, because of the underlying notion here of sharing power as a way to create stability, particularly in the face of strong exogenous pressures. So for instance, if you treat technology as an exogenous pressure, which isn’t technically the case, but in some respects we can actually model it that way more correctly, there’s a need here for us to be neither a mob in our sense-making nor a boy dictator, because those are the two alternatives that are mostly presented, i.e. you have democracy or socialism. But democracy resembles the mob too much, and socialism resembles the boy dictator too much. And neither one of them has worked in all of the attempts that have been made, though obviously democracy has been tried a lot less. But I think you’re right to point out that it is fundamentally a power distribution question. 

Forrest Landry  55:32

And more specifically, it is particularly and principally about not just that the power is distributed, but that effectively it is, and I go back to a democratic way of thinking about this, of the people, by the people, for the people. But not as a mob. And not as some sort of hierarchical power structure, because those would be the two presented options. But to go back to something like what Tristan Harris would say: what’s not on the menu, or not on the menu yet, right? So this notion of a third option, which has some of the characteristics of distributed, decentralized power and authority, and doesn’t devolve into a kind of hierarchical system with an implicit petty dictator, even if it’s not necessarily explicit. This goes back to the tyranny of structurelessness, if you’ve read that article. So in effect, it’s kind of like wanting to account for the deep psychology or the deep anthropology well enough to have pre-compensated for the dynamics that would otherwise centralize power, particularly because technology itself is inherently a centralizing force. Right? As you said earlier, it does not honor borders, and will effectively become a kind of centralized tyranny of sorts. I mean, you mentioned that in your empire concept.

Alexander Bard  57:11

Yeah, The Global Empire. That’s what the book is called. That’s exactly what it says, yes.

Forrest Landry  57:15

So in effect, the notion is that the exogenous pressure of technology is inherently centralizing, and that centralization is inherently problematic. So it’s about dealing with that issue fundamentally, when we have a kind of unconsciousness of our own nature such that some of us would prefer centralization, even when most of us know that’s not a good thing. And so that’s kind of the crux of a lot of the issues, or a lot of the dynamics: understanding specifically how to deal with that.

Alexander Bard  57:53

It becomes a lot easier because of the Facebook crisis and everything. These were very marginal ideas that you and I had with some hackers maybe five years ago, but they’re becoming really mainstream today. Decentralization is now a term everywhere. So that’s a good thing. 

Tim Adalin  58:12

It looks like we’re just about coming to a more explicit response to some of those orienting questions. Perhaps not so consciously; I’m not sure if that was intended. I’m going to read them out, because I think it was part of the framing coming in, and it will be interesting for listeners. 

Tim Adalin  58:40

So the first question was, what are the fundamental dynamics of power in our emerging digital age? And we’ve begun to lay some of the context to respond to that question. And the second question was, in the context of an emerging network of networks, what are the opportunities and risks we face as participants, and hopeful contributors to culture? 

Tim Adalin  59:13

So let me just make a couple of connections here. There was one theme that was coming through. We’ve mentioned narcissism in relation to the sociopathy of the axial age. And the connection now between narcissism and centralization, I think, is interesting from a psychological and now technological perspective, in the sense that Forrest is outlining its tendency toward function. But I’d actually like to step back and, if possible, ask you each, as briefly as possible, though it might not be possible, to offer a definition of culture. And just before I do that, I’d like to also put forward the case that there was something, Alexander, you said at the beginning, which was a link between culture and civilization. And that before there were civilizations, and here you’re referencing the Bronze Age, I imagine, which was roughly 5,000 years ago or a little bit before. But I consider our interest to be in what the healthy dynamics of continuity were previous to the Bronze Age. Because there seems to be also a recognition that there were these emergences of a sort of narcissistic, or otherwise parasitic-on-the-commons, type of relationship that became structurally imposed, and that there’s a link there with civilization as well.

Tim Adalin  1:01:00

So it would look to me that there’s something in the essence of culture which we ought to consider as stemming from further back. And I’m also curious to question the inclusion of certain indigenous perspectives, particularly in Australia, where you have a continuity of, in many respects, a decentralized but coherent culture of networked tribes connected through ritual and songline and custom that existed for many tens of thousands of years, and actually watched the rise and fall of attempts to centralize in permanent settlements, associated with a narcissistic type of impulse, very much as part of that process; attempts that were then rejected and seem to have failed in that sense. So I feel like there’s something in there that’s been a little bit missing from the cauldron of what we’ve established so far. But as for my piece, I’m finished here, except, if possible, to ask you each, and Alexander might have a little bit more to respond to here, as I mentioned him perhaps more in particular: could we see if we could define what we mean by culture? This would help bring clarity, at least for me, as maybe then we look towards understanding the fundamental dynamics of power in our digital age, for the purpose of the kind of balance we’re looking for, in this healthy protopian kind of sense. So perhaps, Forrest, it looked like you were keen to say something; would you like to go first?

Forrest Landry  1:02:42

Well, I was thinking... I think I can do the culture thing pretty compactly. I get the sense, Alexander, that you would probably want to spend some real time on this.

Alexander Bard  1:02:52

Yeah, I can explain how we work with tribal mapping, because we do that as anthropologists. So everywhere was like Australia; you don’t need to worry, Tim. Everyone was like Australia. But we can go back to that. But yeah, why don’t you define that first, Forrest, if you’d like to. Culture and civilization are the same thing in most languages. To a German it would not make sense to speak about anything else, because the word for civilization in German is Kultur. So it is the same word.

Forrest Landry  1:03:15

So I treat these terms in a somewhat technical way. There are a number of overlapping specifics, but if I’m thinking about civilization, I’m thinking about essentially two fundamental capacities. One is the communicative capacity: how are we civil with one another, right? Not all interactions have to involve weapons; you can talk to somebody, you can engage in trade, and so on. The other capacity is the capacity to be in cities. And when the book Sand Talk came up, you know, it basically made the point that cities just aren’t stable, right? You’re thinking about the continuity notion. 

Forrest Landry  1:04:54

So the book basically rejects the notion of civilization, because it depends upon the notion of the city. It didn’t mention civility very much, but that’s sort of written in there with the narcissism piece that was mentioned. And when we think about the dynamic of the city, we’re trying to say, okay, well, what is needed to stabilize the city? Well, we need to stabilize it ecologically. We need to stabilize it energetically, in the sense of power flows, not political power, but literally things like food, whether there’s enough food and water and so on just to keep the population alive, electricity and things like that. And then finally, whether it is socially stable. 

Forrest Landry  1:04:55

So when we start talking about socially stable, we’re starting to talk about power and culture dynamics. But because the notion of the city was invoked as part of the notion of civilization as a fundamental concept, we need to actually pull in the full architecture of what that is, which, as I think about it, has three particular levels. You have the notion of economics or finance: how the people in the city interact with one another, how they trade power, social power, political power, how the governance works, all of that sort of stuff. Then underneath that you have infrastructure: roads, buildings, power grids, water distribution, sewer management, transportation, all that sort of stuff. And then underneath that, you have culture, which would be the value systems, the artistic elements, the languages that are used, the idioms, the narratives, all of the specific ways in which people hold identity and think about themselves as groups and so on. 

Forrest Landry  1:06:01

So roughly speaking, these three notions of group choice making in the sense of governance, politics, and economics, finance, and all that sort of stuff, how money moves, the notion of infrastructure, and the notion of what we’re calling culture here are distinct, inseparable, and non-interchangeable. And there’s a dependency relationship: culture comes first. Alexander, you mentioned this, when you specifically said, you know, the first thing they build is the church. They create an instantiation of the value system that moves up into the first level of infrastructure. So the church building itself is the transition from the cultural layer into the infrastructure layer. And as the infrastructure layer builds out, you know, you end up with a trading post thing, and then before long, you have the finance layer emerge. So there’s a clear dependency between the layering here. 

Forrest Landry  1:07:00

And so a mistake that philosophers, and more particularly people in the larger world, make a lot (obviously, Alexander has not made this particular mistake, but I see it made a lot) is to propose financial instruments, you know, Bitcoin and things like that, as a remediation of world problems. And it’s like, well, yeah, that’s nice, because you’re thinking about it at all. But you’re thinking about it at the level of finance and governance, and you’re not thinking about it at the level of infrastructure, except maybe to think about trustless systems. But you certainly haven’t gotten down to the level where things are really happening, which is at the level of culture. And without understanding the sort of anthropology of culture, which I think Alexander has actually done really well, you really don’t know how to think about architecture, you don’t know how to think about infrastructure, you don’t know how to think about engineering, because the engineering is going to come about as a result of human beings, engineers, and the social influences that they are experiencing. So in this particular sense, I think we’ve both identified that we really want to look at the cultural dynamics to understand how the infrastructure comes into being and what the implications of that infrastructure are going to be, in terms of mob-based choice making, or centralized choice making, that happens at the economic and governance layer.

Forrest Landry  1:08:26

And to sort of kind of connect the dots a little bit, because you mentioned tribal dynamics a little bit. So on one hand, I find that, in the same sort of way, the notion of civilization was rejected because of its dependence on cities. But that was premature, because it may be the case, and this is, of course, an open question, but it may be the case, that we could actually stabilize the social structure so that it has, as Alexander put it, the right kinds of power-sharing characteristics, so that it’s more like, you know, the Persian culture that has, you know, serious endurance, rather than something very temporary, like, oh, I don’t know, the recent administration. And, in effect, there’s a phenomenology here to, you know, basically stabilizing culture, or the notion of finance and governance, in a way that actually works. And then we can consider the notions of food or resource balance, and we can reconsider the notion of ecology balance and energy balance, such that cities could actually work. 

Forrest Landry  1:09:39

I mean, I don’t see that there’s any fundamental technological reason, no physical principle, that’s preventing us from achieving a stable civilization on the basis of a stable city as a unit. Obviously, we’ve not done that currently; we’re quite terrible at thinking about large-scale chronic problems and thinking about design in terms of centuries rather than minutes. But the notion here is that there’s nothing in principle that prevents that from being possible. But to really be able to do that, we need to go beyond thinking about it as a finance problem to understand that it is actually a governance problem, and that its manifestation isn’t going to be at the level of infrastructure; it’s going to go all the way back down to the level of culture, in this dependency sequence that I named earlier. 

Forrest Landry  1:10:31

More than that, we aren’t really going to be able to work on the cultural level unless we understand the ecology from which that culture itself arises. Right? So we are fundamentally social creatures. We’re tribal beings. We have, you know, a kind of interdependence upon one another as physical bodies to make clothing and to prepare food and to build houses and to do all the stuff that effectively creates all the modern conveniences within which we live and on which we have clear dependence. You know, so in this sense, there’s a sort of reconciliation that is needed, in the sense of having a clear understanding of the dynamics of nature and how it manifests in the evolutionary model of the human being that we currently are. And so, in effect, by understanding the social dynamics of that, and the anthropological dynamics of that, we could actually have influence on culture such that we do stabilize city and civilization. 

Forrest Landry  1:11:36

So in this particular sense, I do distinguish between civilization, city, and civility. I recognize culture as being very much about the nature of communicative process, i.e., what the civility is. But in order to get that to work well, we really need to understand nature well, from a kind of biological and psychological anthropological perspective. So in other words, to really look at the process of communication, going back to, say, Habermas, by which the reason and rationality of the group is moved from mob rule, or from centralized totalitarianism, to something which is genuinely distributed at the choice-making level, that isn’t unconscious like current governance practices, or unconscious like current market practices, but is actually genuinely conscious in an integrative sense, such that the ecological perspective folds up from the substrate of the cultural phenomena into the outcome of the finance and governance level. 

Forrest Landry  1:12:47

So rather than thinking about it as a finance or governance layer, I’m actually thinking about it as a new ecology, a sort of meta-ecology that emerges out of the being or the essence of the city and the civilization. And so, in effect, part of the reason why we as a species have not been effective at doing this sort of work previously is because, quite frankly, we haven’t really understood the notion of choice. We’ve gotten really good at understanding the notion of causation; we have had the Enlightenment, quote unquote, and the sort of Industrial Revolution and the use of mathematics and technology and science to understand the world in a purely causal way. But all of those objective, outwardly focused things haven’t prepared us for understanding our own true nature. And religion hasn’t done that either, because religion is no better at understanding the nature of the subjective, and the nature of choice, in the fundamental senses to which I’m referring. To go back to things like what is the essential human condition in the sense of cells, tissues, and organs: you know, if we’re looking at trying to understand why people behave the way that they do, sure, we could talk about it in terms of divine emanation. But I think to some extent we really need to understand it in terms of things like neurochemistry, and, you know, has the person been exposed to trauma, have they been exposed to essentially the right sort of nutrients so that they’re not feeling anxiety because they’re missing something in their body that is required for them to feel comfortable in their skin?

Forrest Landry  1:14:18

So in this particular sense, I think that to a large extent there is an entire field of study that has barely been touched on and that is absolutely essential for the continuance of the species. Because we haven’t, up to this particular point, really gotten that good at communicating. And so, in effect, this exercise of Alexander and myself sitting together is an exercise of communication, because we’re, in a sense, comparing notes. I’m learning from what he’s explored and feeling validation, because he’s found that the things that I think are important are actually important. And I feel that there’s a very good likelihood that I’ve just had an influence on him and his thinking as well in saying all of this.

Alexander Bard  1:15:00

Yeah, I agree. The tribal mapping project that Söderqvist and I undertook was that we took our team around the world to different climate zones, the Arctic, the jungles, everywhere, and studied 17 different tribal communities that are still around. It’s like the last opportunity you have in history to study tribes that are not completely, you know, in the jeans-and-t-shirt Wi-Fi mode. So it turns out that they were very similar. And then we just compared the data with contemporary human beings, we actually used data from millions of people, and it turns out that people, without knowing it, subconsciously, are behaving exactly the same way. 

Alexander Bard  1:15:37

So when people feel safe at home and creative and at their best, that’s usually when they’re at their most tribal. This is, you know, the best evidence you could ever have against racism, for example. You could just get that out of the way to begin with, because we were the same species until at least 30,000 years ago, then split over a few continents, and a few trade routes and things made us mix still. And that’s humans, that’s what humans are. In the tribal mapping project it was very clear that if you’re 19 years old and you’re fully resolved, it’s very likely that after a rite of passage an old woman will come up to you and smack you in the face and tell you you’re nothing without her. So narcissism is not a problem in a tribe. In Africa they give Iboga to the young guys just to get them into shape, so they bend down and then start serving the tribe. Which human beings ultimately want to do; we want to be contributive to some kind of tribal community. That’s fundamentally human. That’s why I’m against individualism. I think it was a terrible mistake to follow Descartes in the 17th century. I think one of the mistakes he made was a bad religion called individualism. And it came out of another pop religion called Christianity. I don’t agree with Forrest that we could just throw religion out the door, because there are actually better alternatives. But the better alternatives were the ones we mostly ignored the last 3,000 years; they’re out there, and we can go into that.

Forrest Landry  1:16:58

I’m not throwing religion out at all. I used the word religion in the sense that I think you are, which is, how groups of people come together and integrate. So the sense that you’re using it, I’m with you.

Alexander Bard  1:17:09

Yes. That’s the case. Exactly. It’s narratives again, it’s a narrative about the tribe itself. And those narratives are pathic, they’re logical, and they’re mythical, in different ways. But it’s all about priests, essentially, who have been trained forever, at least since civilization was born, to try to create narratives that make people not kill each other. You know, can we prevent the war for at least another week or so and maybe have, you know, a festival or something? That’s what priests have always tried to do, at least when they’re good at it. And that’s what religion essentially is as well, when different religions clash or different tribes clash. So we’re comfortable with a smaller form. Narcissism is not a problem in a tribal community. 

Alexander Bard  1:17:49

So when does it become a problem? It becomes a problem with permanent settlement. We get written language. And because we get written language, we can store information on a level we never did before. And we can learn from previous generations’ mistakes, so we don’t have to make those mistakes again. This is Zoroastrianism. This is Zoroaster, 3,700 years ago, the first of the major prophets, who basically declared that the Son could actually create a world that’s better than the Father’s.

Alexander Bard  1:18:14

Now Forrest might not agree with me and with Zoroaster, but that’s at least an idea. And it’s the only new idea ever. Because the idea before that was just that everything returns to the same. It’s called Hinduism still today. It’s called nomadology in our work. Nomadology is essentially the religion of the nomadic tribe: everything is always the same. You’re born and you live and you die, somebody else is born and lives and dies, and it’s recycled; then you can reincarnate or whatever, you don’t care. It’s just an eternal return of the same, as Nietzsche said. And that was the religion until somebody 3,700 years ago came up with the idea that, well, the son’s world could be different from the father’s, for good or bad, asha or druj in Persian, because we have more information available than the previous generation did. 

Alexander Bard  1:19:00

Now that idea easily transfers eventually to China and India. And this is essentially the birth of the East. And essentially we get a West in response to that, thanks to the Greeks and the Hebrews and the Phoenicians. And these cultures, especially the Western cultures, interestingly, took onto the idea of the event: something can happen that changes history forever. It might be Armageddon, which is what we call August 6, 1945 today. We live in the shadow of Armageddon. But it could also be that somebody can invent a technology that changes the world forever, so that everybody in the world can communicate with everybody else, including finding out what is true and false in a way we were never able to do before, which hopefully is where the internet is heading eventually. 

Alexander Bard  1:19:43

So it’s in the city that crime occurs, because it’s much larger, and the connection to the old woman who smacks you in the face when you’re 19 has disappeared. Clearly, more than ever, it’s a problem that we have disconnected between generations. Everybody talks about wisdom these days. All I’m saying is that wisdom is just long life. It’s just life experience. And especially if you have several people with long life experience around you, you have wisdom around you. That’s what wisdom is. So what happened in the cities was that trust disappeared. Anarchy was the result of that; cities were incredibly violent for a long time. They still attracted people in, because there were trading posts, there was stuff to be made, you could climb the hierarchies, but they were very violent places. Until people started figuring out that we can actually control the anarchy. And the way you did that was through the law. And the law was written down. Thereby the law got the aura of being something sustainable, or lasting forever, eternal. And you wouldn’t hurt it. 

Alexander Bard  1:20:45

We started interpreting nature as if it was full of laws as well, because we had laws between us. And it turned out that at least crime went down; there was less violence, more trading, more copulation, populations grew, and production of food increased dramatically. And of course that model worked, in the sense that if we disregard the fact that it was exploitative, which wasn’t a major problem till industry came along, then at least it worked in the sense that it created larger populations than any other model did. Mesopotamia alone 4,000 years ago had half the world’s population. 

Alexander Bard  1:21:19

Now, that means the river valleys with large populations could create great armies, and these big armies could then fight everybody else, except for a few nomads that had horses, who, up until the Mongols 600 years ago, could cause havoc to these river valleys. The conflicts between the nomads on the steppes and the river valleys lasted until the plague of the 14th century. After that, the cities could even have cannon boats and much bigger weaponry, and also started training soldiers to read, write, and count, meaning they were better killing machines than ever. And finally, civilization won over the nomads. That’s history, essentially. Now my proposal is basically: why don’t we then go back and study how we lived for 60,000 years in tribal communities? It was probably tough as hell, but that’s how we were shaped. That’s where our genetics were shaped. That’s called sociontology today, the study of the original tribe. 

Alexander Bard  1:22:17

Now we can compare that to civilization, which has a written-language period we call feudalism. It has a printed, mass-distributed-language period we call capitalism. And now we’re leaving capitalism for a system that we call attentionalism. This is incredibly complex to explain. But just by giving you an example, like Google search: the ads are like desperados, right? We don’t push the ads. We hate ads more than ever. We hate capitalism. We go for attentionalism, because attentionalism goes into the sacred realm of directly connecting human beings to one another without trading with one another. 

Alexander Bard  1:22:48

The problem is that everyone wants that attention. So now capitalism is trying to move into the most sacred realm of human existence. We’ll fight it back. We hate Mark Zuckerberg and Facebook. Instead of helping our children, he employed thousands of psychologists to turn our children into addicts. Another one of these boy pharaohs, evil guy, yeah. You know, any Christian Republican woman in Texas today is an enemy of Google and Facebook. And I cheer them on, because I think decentralization is what we desperately need. So that’s essentially it: we can take from history, but we have to understand the specific predicament we’re in right now, the specific sort of hyper-technological environment we’re in. So we cannot go back and just take something from 3,000 years ago and apply it to the world today and think it’s going to work. No. Human psychology is something we can learn a lot about from history, but not the technological environment we’re in; it’s very, very specific and must be understood exactly that way.

Forrest Landry  1:23:45

I think, in just general response to all that: first of all, that was delightful to hear. And you touched on a number of points which brought a smile to my mind. So I think that maybe one way to sort of compare and contrast our approaches a little bit is that where you’re looking at history as a way of understanding the human condition, I’m also doing that, but I’m actually doing it through the methodology of principles. So in other words, from this very abstract core work (you mentioned the immanent metaphysics very early on), what that gave me was a set of tools to identify principles very, very well. 

Forrest Landry  1:24:34

So this is where we move from the descriptive to the prescriptive side, because I can look at history in a descriptive way. And I can say, okay, here are my hypotheses about what happened. And then I can test those hypotheses by looking at other historical events. And that works recursively, reasonably well. I mean, we can learn some real things from that. The alternate method, which has been the one that I’ve been exploring more specifically, is, starting from this underlying hyper-geometric core, so to speak, to abstract a series of principles as they would be projected into the realm of the anthropological, the psychological, the relationship between man, machine, and nature, as prior principles, and then to look to see whether those principles reify our understanding of history. 

Forrest Landry  1:25:23

So in other words, we can look at historical events and say, does this principle help us to understand what actually happened? And if it does, does that help us to understand the things that happened around it? And does it essentially increase the clarity of the narrative? Does it connect the narratives together? Does it deepen our insight? And so, to that extent, we would feel that, yeah, we actually had the right principles. We were effectively confirming the underlying geometry and topology of the relationships, both in the field of the actual events of history, but also in the field of the principles that gave us the capacity to understand that history. 

Forrest Landry  1:26:04

So in a sense, we don’t have to come at history just from a descriptive point of view anymore. We can essentially describe it using these different lenses, and then test the lenses out. And so in a sense there’s this process where now, through this, we can effectively get some very strong evidential confirmation as to the specificity of the principles that are applied, and could be applied to new situations. So in effect, it’s a little bit like a kind of constitutional convention, where the Founding Fathers got together and they said, okay, we’ve studied all this mysticism and this religion, and we’ve had some really horrible experiences with Mother England. And now we want to do things differently. And we’re going to basically do that by taking the best of what we believe to be the case, checks and balances, you know, three branches of government–

Alexander Bard  1:27:00

You know that was a Persian innovation?

Forrest Landry  1:27:03

Well, it could very well be, and–

Alexander Bard  1:27:04

It was the French that brought it to America. It was originally how the Persian Empire operated by the three centers of power.

Forrest Landry  1:27:10

Awesome. But the point is that somewhere in history, whether it came from Persia, you know, we could just roll the clock back, right? Somewhere along the way, when the Persians were coming up with these ideas, they became reified in philosophers’ thinking subsequently. Each generation of philosophers has sort of worked with what they’ve learned from previous generations and projected it into the current circumstances that they live in. In this particular case, it’s a bit like I can stand outside of time. And I can look at even the very earliest works that the Persians developed, because they invented it, right? Maybe they got it from somewhere else, but somewhere in human history it was invented. It was an evolutionary response: some group of people or some individuals basically thought clearly and came up with some ideas about how to do things better. As you mentioned, there is a kind of progress, that maybe the son can actually do better than the father. 

Forrest Landry  1:28:14

And this idea here of what those concepts are, right, the three branches of government and the notion of checks and balances, those concepts themselves are practices that are projections of principles. And so we can codify them in a set of rules and laws and sort of heuristics of how to relate to one another, and economic systems and so on. You know, the notion of ownership. All of these heuristics are effectively rules that come out of practices that themselves came out of principles.

Alexander Bard  1:28:48

I can even point you to where it starts, if you like. According to legend, Zoroaster was a priest and Vishtaspa was a king, and they isolated themselves for 23 years to construct the first monotheistic religion. And they walked out of the door well prepared, the way we wish we would be today, and created the first Persian empire. And they realized the enormous capacity that was inherent in the fact that information was now written down and stored on a level they’d never seen before. And out of that came the first Persian empire and that idea. And power would then be split in three rather than two, because with three, if somebody becomes narcissistic, as Tim pointed out, the other two would then unite against the third. But it also mimics really well how human beings operate. 

Alexander Bard  1:29:37

The genius of Zoroaster, I think, is the fact that human beings operate in three realms, as Jacques Lacan, the psychoanalyst, would say. They operate within the symbolic, the imaginary, and the real. The real here needs quotation marks, because it’s not real as in reality. It’s more like the real that constantly surprises us. I would always give September 11 as an example of the real. We had a fantasy about how America and the West operated. And suddenly somebody came in and cut our two dicks off in New York City. And we were in a shock state for the next 10 years, making all the mistakes they wanted us to make. Because we weren’t prepared. That’s the real. 

Alexander Bard  1:30:13

So the real power is actually the power over the actual resources in a given society, as Karl Marx would say. That would have been capital up until now; it’s becoming attention, and before that it was land ownership. The real assets, like mines and things like that. Then we have the imaginary: that’s what most people mean when they say who’s got the power, the person they point to who has the power, Biden, Trump, or an old king of Sweden or whatever. That’s where the imaginary power resides. And the third one is the symbolic power. And that’s what we talk about here when we’re philosophers. That’s the narrative: who’s in control of the narrative. It used to be the church in the past here in the West, then it was academia. And now academia is dying quickly too, and we will probably have a new sort of digital real power, digital imaginary power, and digital symbolic power, the way we had urban nation-state real power, urban nation-state imaginary power, and urban nation-state symbolic power. 

Alexander Bard  1:31:07

But what’s great about it is that there is actually, I would argue, a built-in triangle here. All power is actually within our own minds; we ourselves are trying to locate our position in the world and trying to identify who we are, and the groups we live within, the communities we live in, also try to find these three. So it’s actually quite helpful. Human beings, probably from the original tribe, had this idea that there were three. 

Alexander Bard  1:31:33

It’s interesting to see, for example, if you study the Hebrew Exodus out of Egypt. In the original story, there was just Moses, who was very likely an Egyptian guru or sect leader of some kind. This was an Egyptian sect that was disappointed with Akhenaten or something, but still wanted to believe in the one God, so they left Egypt, and eventually they rewrote their history. But after the Persian influence on Hebrew culture, they rewrote the Exodus to have three siblings: there’s Moses, there’s Aaron, and there’s Miriam. And, of course, Moses is the Congress, Aaron is supposed to be the President, and Miriam is supposed to be the Supreme Court. So you’ve got these really, really strong triads–thinking about principles here–that constantly come back in history, and all the sustainable systems and reasonable systems that lasted a long time, where you could certainly work with principles like imploitation, seem to have this character to them.

Forrest Landry  1:32:32

Well, I’m with you on all that. I guess my general observation is just that, if we first of all agree on triangles and this sort of triadic relationship, I mean, obviously, I’m sure you know that my own work is based on that very directly. So in effect, these principles are going to emerge multiple times and in multiple people’s descriptions and so on. If we–

Alexander Bard  1:32:55

I should add here that Forrest is a dialectical genius. And when we do dialectics, you work with three. You do that all the time, it’s very good, yes.

Forrest Landry  1:33:06

Thank you. In regards to the triple of the relationship between man, machine, and nature: insofar as, prior to machine appearing on the scene, you know, the machine part of it really wasn’t evident. I mean, there was system, and, obviously, city is a kind of system, but the notion of machine effectively coming out of balance in relationship–right now, machine is becoming very dominant, in the sense that you mentioned already, that many people are addicted to things like Facebook or social media in one form or another. So if we are effectively going to develop a new triadic infrastructure that has stabilization, because it is adhering to deep principles that actually hold in the structure of the universe itself, then, you know, to the extent that we are dealing with new contexts, and civilization and technology in particular are creating a new context, the use of prior ways of thinking about those triads needs, in effect, to be reapplied in the new context. 

Forrest Landry  1:34:16

And so, to some extent, to do that well, to have confidence that we have put together a model of how to do distributed power-sharing checks and balances in the context of unusual manifestations of technology, we’re going to need to know what those principles are and apply them in a far more conscious way than we ever have as a species previously. So in this particular sense, I see a great deal of value in the narrative, and the notion of narrative, and the notion of studying history and anthropology in these particular senses. But I also feel that there’s a kind of limit. And the limit is that while we can be very, very good at understanding the past, we need to be able to move beyond the past to imagine the future that we require. This goes back to the existential risk piece and pulls in some of the work of Snowden, for example, where he’s basically talking about safe-to-fail probes. 

Alexander Bard  1:35:22

Just a small passage here: the way we do it is that we split Nietzsche’s concept of the will to power. Nietzsche was basically sloppy, and we split the will to power into the will to intelligence and the will to transcendence. Will to intelligence is collecting all of history, all the data we could possibly have, all the way up till now. Will to transcendence is imagining the future. So I totally agree with you on this one. That’s the terminology we use.

Forrest Landry  1:35:49

Okay, that makes sense. Just out of curiosity, was there a third split?

Alexander Bard  1:35:55

Not with the will to power, not necessarily, not that it makes sense. But that’s because we’re dealing here with phallus, and phallus here, compared to matrix (where we come from and where we’re going, psychoanalytically speaking), is called the two-headed phallus. The two-headed phallus means that for any group to be led by anybody, the leaders must be two, and they must not be connected to one another. One of them is the leader of the mind, one is the leader of the body. One is the chieftain, one is the priest. And the priest is essentially the will to intelligence personified. So it’s a really, you know, great mind person who does that, who then respects and admires the best of the body people, who is then the chieftain or the king. And that other power is the will to transcendence, and we actually put the will to transcendence on physical power itself and connection to nature.

Forrest Landry  1:36:49

I think this is again a place where I feel we may have diverged. Because, you know, when we think about, again, safe-to-fail probes, we only really get one chance. Right? With that level of technology, eventually there’s just a break, there’s a breach. So, in effect, I don’t feel, based upon the principles which I am now in possession of, that even the level of centralization associated with just two leaders, or even two narratives–so, in effect, the level of decentralization I’m interested in achieving is a kind of, a level of, a notion of social process that doesn’t actually have contingency on leadership or narrative at all.

Alexander Bard  1:37:40

No, the point here is that by teaching the first split, you can then go on splitting.

Forrest Landry  1:37:46

But it won’t. You see, the forces of technology and centralization are very strong. 

Alexander Bard  1:37:52

Yes. 

Forrest Landry  1:37:52

So in effect, without creating a very good, coherent model of how to do decentralized authority distribution–or just a notion of resource distribution in a distributed way, right, in a way that doesn’t become corrupted to private interest, to the forces that would otherwise motivate people to think selfishly, to think individually. And I’m not saying that in a pejorative sense. I’m just saying that in a biological sense. If you look at, you know, however many millions of species there are, there are very few species that are actually social. There’s little room for individuals of any species to think altruistically. But if we are going to create a kind of distributed choice-making process, the net effect does have to actually be altruistic with respect to the ecosystem, in that imploitative sense actually occurring. So in that particular degree, there is–

Alexander Bard  1:38:51

We call that the Messianic.

Forrest Landry  1:38:55

Well when I hear the notion messianic, and this may be misunderstanding you, but I still think of a messiah as a person.

Alexander Bard  1:39:02

No, yeah. But it isn’t. That’s because you inherited the Messiah from Judaism. The Messianic starts with the Persians, too. It’s called the Saoshyant. The Saoshyant is a function that steps into history under very spectacular or specific circumstances, where the will to transcendence and the will to intelligence have to collaborate very tightly, because the unique situation demands it. That’s called Saoshyant in ancient Persian, and the Jews were so inspired by it that they thought Cyrus the Great was a Saoshyant when he liberated the Jews out of Babylon. And he sponsored them to build a second temple. That then became the Messiah. The Messiah–the idea that one leader could lead the people. Yeah, for a smaller people like the Jews, like a nation, that’s absolutely possible. The messianic characters throughout history have led a certain people or something like that. But for the entire world–which was Zoroaster’s idea in Persian, in Zoroastrianism–Zoroaster’s idea was that under specific circumstances, usually at the end of the Empire, before the fall of the Empire, the Empire can be saved one last time and then go on for another thousand years through a function called the Saoshyant. And that’s what we call the Messianic, and I think that’s where you’re heading here. So we’re aware of that.

Forrest Landry  1:40:24

Well, I’m, again–the will to power, the will to knowledge: the notion here is that those who are experienced at an individual level, and I’m–

Alexander Bard  1:40:36

No, no, no. Nietzsche makes that mistake himself, yes. But that’s because Western thinking after Descartes and Kant is completely based on the idea of an individual–an individual philosopher who thinks about himself. He’s very narcissistic. I am totally against Western individualism. I have worked all the way back through history, especially immersing myself in Chinese and Indian philosophy, where the idea of the individual is ridiculous.

Forrest Landry  1:40:59

This is where I come into confusion, because when I hear you speaking, I find myself encountering a lot of terminology that on one hand seems to indicate a clear awareness of needing to move beyond individualism.

Alexander Bard  1:41:16

Yeah. 

Forrest Landry  1:41:17

But at the same time, I don’t see the sort of corresponding patterns that actually describe decentralization. So for instance, the notion of empire or the notion of narrative, are both, to me, highly centralized concepts, or concepts referring to something indelibly associated with a centralized methodology.

Alexander Bard  1:41:39

Yeah. That is why, when we know technology is going towards Empire, that’s also when we need to watch out carefully, because we are not. Human beings are not, actually. And that’s why empires always fall apart, in the sense that a dictator tries to control a large territory with a large population. At the end of the day, we are naturally decentralized. We are tribal. And that can play to our advantage when we’re looking at the current situation. But these tribes need to collaborate to create the sort of systemic causality you’re asking for, to solve the current problems. Does that make sense?

Forrest Landry  1:42:15

Yes, it does. But I’m still wondering how you implement the compensation. So for instance, I’m not saying compensation in the sense of an economic exchange, but a compensation for the forces that prefer centralization. On a biological level, biology does really, really well, as you said, it naturally manifests Gaussian distributions, right? It naturally prefers decentralization. But when you get to anything to do with technology, or with causation or with human beings in social structures, such as cities, the push from an economic perspective, and from a psychodynamic perspective is always going to be very strongly in favor of centralization in one form or another. And so in a sense, there’s a need for us to–if we’re understanding the principles particularly well–to see the application of the principles at this layer specifically, because literally anything short of that just means that we’re continually caught in this cycle of reincarnating cities over and over again, until we basically blow them up so thoroughly, that there’s no Earth left for them to sit on.

Alexander Bard  1:43:25

Yeah, okay. My response to that is that these are dialectical processes, and the one we’re working with is called sensocracy: the Internet of Things, to satellites everywhere. So if we collect data through sensors everywhere, connected to our senses, we create a sensocracy. That’s where we’re heading with most things anyway. What’s called politics in the past will hopefully then be replaced by sensocracy.

Alexander Bard  1:43:53

There are some people with very good ideas about how they want to establish a sensocracy. This time the Egyptians got there first: that is, the Communist Chinese model. We don’t believe it is very sustainable in the long run, because we don’t believe boy pharaohs are a good idea. And by the way, we love something called freedom. Okay. So this is what we’re working on. And the great thing is that once you centralize all the data about a certain system, it can still work, as long as you also distribute the data back to players within that system. For example, you don’t mind giving away your data, as long as you give it away to five competing agents rather than to just one. In China, you give your data to one central computer, and you can’t do anything about it unless you just get off the grid completely.

Alexander Bard  1:44:42

In systems where you have a lot of different competing agents that are provided with data–and that’s dependent on how much data we’re willing to give away. Why? Because you will also have a much more honest and truthful view of the world when algorithms are free and open, organized in such a way as to understand the world better. And I think it’s absolutely necessary. To get out of the whole, you know, controversy about ecology, there have to be more facts. And the more facts that are there, the harder it is to get around the arguments. And at the end of the day, that’s the only thing that will convince humans to get on the move. Even if they are then decentralised in different communities, those communities can cooperate and operate in such a way that… we know for a fact that doing this results in that, for example. Well, that makes it a lot easier to motivate people. And that’s where I leave it. What I do is write a descriptive philosophy on this. Then I go on tour to engineering universities around Europe, and they’re packed. And then I just say to the engineers: I have a mission here, if you want to be part of the Messianic project, which is called saving the planet.

Forrest Landry  1:46:11

Unfortunately, there were parts of that that were a little hard to hear, because your internet signal gave out just briefly. I did manage to pick out all the pieces. Tim, I do believe you caught that too.

Tim Adalin  1:46:24

Yeah, that’s right. I think if there’s a piece that requires clarification, it’s literally just the last couple of sentences, with respect to the particular phrasing of inviting engineers to participate in the Messianic project. So effectively, what’s under consideration here is, I suppose, other particulars of that invitation with respect to the affordances of the architecture of that actual structure. And I don’t believe that has been entirely spoken to yet. And I believe that’s possibly where Forrest’s response is going to take us. And then Alexander, you can clarify, I think.

Forrest Landry  1:47:05

Just checking with you: Tim’s summary sounds right to me, is that correct to you? 

Alexander Bard  1:47:10

Yes, we call it ecotopianism as a constructive response to environmentalism, to kickstart a dialectical process of creative thinking between the two poles.

Forrest Landry  1:47:20

Got it. I’m definitely of the mind that Tim was recommending. So I feel for you a sense of arrival at this point. You say this is as far as you’ve gone with it. So in this particular sense, using the language of this as a position, and just to give you a sense as to where I’ve gone: to me, this is about halfway through my work. From here, I’ve gone considerably farther. There’s a whole series of subsequent stages that this has gone through. What I can report back about some of the later stages is that it turns out there are certain risks in the methodology itself that need to be handled.

Forrest Landry  1:48:08

So for example, with looking at sensocracy. Over the years, I’ve had people propose to me various sensocracy-type models, where they would do sensor arrays to collect information about biological process, you know, cultural process or economic process. And in one way or another, these things would filter back to a kind of triple-net accounting or whole-systems accounting or, you know, integrative thinking about how to essentially balance and distribute the flows of energy and particularly of atoms. Now, on a lot of levels, approaches like that sound really good. There’s a lot of goodness that can come out of them. However, in almost all of the descriptions of such things that I’ve heard or seen or had presented–and unfortunately, including yours–it doesn’t feel to me that there was a clear sense of how to deal with the inevitable fact that the information itself, the attention aspects themselves, that the economic forces are pointed at it so strongly. As you mentioned, the economy is ready to usurp the attention–what you referred to as the attention economy, I don’t remember the term you used–

Alexander Bard  1:49:31

No, I’m not. That’s Tristan Harris. I insist that attention cannot be an economy. Addiction can be, obviously. Addiction is the economy at the moment. There are all kinds of terrible new economies around. Attention is precisely concerned with what humans will not buy and sell. That’s what everybody’s after today. That’s why marketing is dying. Advertising is dying. We hate spam. I think spam filters and ad blockers, together with Edward Snowden, are the most messianic things we’ve invented in the last 20 years.

Forrest Landry  1:49:31

Your comments are relevant, but they’re not getting to what I’m trying to get to. In other words, I agree with your response, but the thing I’m actually trying to get to is in a different space. I appreciate your answer, but it’s not what I’m needing. What I’m looking for is, in a sensocracy, or in any system where data is collected, that there are forces that move against centralization, or the accumulation of power, or the reemergence of foci of control, that would effectively result in a destabilized system–yet at a meta level.

Forrest Landry  1:50:50

And so in effect, what I have done is, first of all, notice that particular absence. In other words, I’m basically saying: okay, where I arrived at was that this is actually a question that’s important. And what would we need to answer a question like this? What principles would need to be brought to bear? What is the basis of my skepticism? So for instance, I can articulate particularly that part of the dynamic is that we’re using technology to try to address technological issues. So in one fashion or another, we’re effectively using complexity to try to contain complexity; using strategy–but strategy is itself a kind of technology–using strategy to try to deal with strategy; using system to try to deal with system.

Forrest Landry  1:51:38

And so first of all, we notice immediately that there is a distinction between complicated and complex–this goes back to Snowden again. And, first of all, the complicated is never going to contain the complex. The complex will eventually exceed the complicated. I know you know what this is, right. But moreover, even if we were to live at just the level of using complex to deal with complex–so that we didn’t have just the naive case of trying to use blind system to deal with organic realities–in effect, there’s still a fundamental emergence happening within the organic reality of the complex, such that centralized control mechanisms and points of articulation are still going to be leveraged in weaponized ways. Corruption will still happen.

Forrest Landry  1:52:36

So in effect, there’s a phenomenology here: I can’t use complexity, or the complicated, to contain the complicated, because no matter how I think about it, that ends up being an arms race–which is why most people go to artificial intelligence as a response. Because, you know, that’s the preeminent… it’s like, well, it’s gotten beyond the point at which we can deal with it as humans; maybe we can leverage the machines to deal with the machines, because they’ll understand themselves better. And it turns out that there’s no way to win such an arms race. It goes beyond the human, and the human gets divested from the system, guaranteed, every single time. This comes down to a level of mathematical proof. There’s a level of rigor to this that basically means this entire trajectory of trying to use any form of strategy to deal with these things ultimately fails, and cannot not fail. So this is part of the reason why I have skepticism about the approach.

Alexander Bard  1:53:40

I love you for this, Forrest. We might not disagree so much. It’s more that I’m operating not only with logos but with mythos and pathos. And my job, partly, with my theater background and as an anthropologist, is to create some hope here. In a way you’re darker than I am here. But it’s fascinating. This is certainly where you also will come into our next book, because we are obviously writing on sensocracy, and we’re neutral about it. This could be a pharmakon again.

Forrest Landry  1:54:14

I can’t be neutral here. But what I can do, and this is the good thing–so at this level, it looks like I’m very dark–but the metaphysics has given a door.

Tim Adalin  1:54:25

There’s about eight hours to go here.

Forrest Landry  1:54:28

No, I’m not going to take that long. I’m going to take maybe five minutes. 

Alexander Bard  1:54:31

Haha.

Forrest Landry  1:54:31

So the thing is that it turns out there’s another way. I don’t have to use complicated to deal with complicated. I can use clear. Clarity transcends the complex.

Alexander Bard  1:54:52

That is my difference between the will to transcendence and will to intelligence, because you’re basically saying you cannot take a will to intelligence, bump it up against another will to intelligence and another will to intelligence, and then try to solve a problem which requires a will to transcendence.

Forrest Landry  1:55:06

Yeah, that’s right. That game will never win.

Alexander Bard  1:55:09

No, exactly. I agree, actually. That’s why we’ve separated the two, as we split Nietzsche’s will to power–because then we have two building blocks to build from, and they’re very different desires to begin with. Machines can only do the will to intelligence, by the way. I can’t see any foreseeable future in which a machine would have a will to transcendence.

Forrest Landry  1:55:30

We’re agreed here.

Alexander Bard  1:55:31

Yeah.

Forrest Landry  1:55:33

To connect to this a bit: if you basically have, as has been the case for practically all of history, strategy manipulating culture with the use of vision, then in effect you will just end up with a cessation. I mean, this is part of what I was trying to communicate in one or two of the emails that went to the list. It is necessarily the case that culture, infused by vision, derives strategy. There is a kind of transcendental design that emerges out of the culture, because of the clarity of the vision.

Forrest Landry  1:56:26

And so in effect, as a result, I spend a lot of time thinking about and really working with the dynamics of the relationship between culture and vision, and mitigating the effects of the relationship between strategy and culture. So, in this particular sense, there’s a kind of axiom two dynamic that is fundamentally necessary here. The human species has been going against this. If you look at the flow, with culture as prior to strategy, you end up with sanity. That’s your distributed, decentralized power actually happening. But if you start with strategy as prior to culture, that is your centralized, complete miasma of a dictatorial, centralized rule system that we both know can’t work. It can work short term, but the world burns as a result.

Alexander Bard  1:57:32

Yes. 

Forrest Landry  1:57:33

And if you try to have vision lead culture, but the vision isn’t actually vision, that’s your mob rule, right? You attempt decentralized vision, but you actually accidentally achieve centralized strategy.

Alexander Bard  1:57:53

We call these the anoject and the hyperject, these two different types. Or authentic vision and false vision.

Forrest Landry  1:57:59

Well, I’m not sure about false vision, but I can point out that–

Alexander Bard  1:58:06

It’s something that pretends to vision but isn’t. 

Forrest Landry  1:58:08

Okay. A mob that is in effect in reaction to some stimulus, that’s false vision from my perspective.

Alexander Bard  1:58:17

Adolf Hitler. 

Forrest Landry  1:58:19

Yeah, perhaps. There’s a sense here that there needs to be a right relationship between vision, culture, and strategy, and that this has not been achieved yet. And part of the nature of what makes that particular dynamic actually work is not so much about the data flows. And it turns out not even to be about the narratives, or about the leadership creating the narratives. In effect, you can dispense with leadership, and you can dispense with narrative almost altogether, if you come back to the ground truth of being.

Forrest Landry  1:59:01

And by ground truth of being, I’m talking about deeply having clarity about the nature of the human. This is coming back to the anthropology perspective, and so on and so forth. You know, we use the history and the tribal understandings of what you’re talking about brilliantly, and… we understand those things as essentially infusing our knowledge of the principles of what it is to be. But at that moment, once we have those principles–or once we, at least in this case, have tested those principles from a yet more abstract source, so that we’ve in a sense validated them–we’ve gone from the omniscient knowing, through the transcendent, back into being. So in effect, our actual embodied state, accounting for the psychology of how people are today in the context of technology, right? If we don’t genuinely understand the reasons that people are addicted to Facebook right now, or Reddit right now–if we don’t deeply understand the principles that created that as an outcome, at the level of not just business dynamics and how engineers, you know, Aspies, for example, may create something differently depending upon the kinds of encouragement they get or what their paycheck is–there’s a point here where we can go deeper than that sort of phenomena and look at what dynamics emerge out of these fields of relationship: dynamics that are either going to will towards centralization, or actually manifest an awareness of choice-making that is genuinely that.

Alexander Bard  2:00:52

You just formulated the triad of vision, strategy and culture in itself.

Forrest Landry  2:00:57

Well, I did. And I did it specifically–

Alexander Bard  2:00:58

The imaginary is what you call vision here. The symbolic is what you call strategy here. And culture itself is the real, about the human condition itself. It’s brilliant. Yeah, we totally agree.

Forrest Landry  2:01:10

We’re using overlapping language–

Alexander Bard  2:01:13

Yeah, this is the direct result of working with these very, very simple starting points: man as the constant and the machine as the variable. We arrive in the same place. It’s beautiful.

Forrest Landry  2:01:26

There’s no question that the language overlaps, because of the triadic nature and because of the axiom three foundation. I see echoes of what would be axiom one formulations from time to time. What I have not seen is an understanding of axiom two. And that’s where the practice lives. So in effect, what I’m finding is that when I look at the methodologies being suggested, and so on and so forth, I don’t see a clarity with respect to what the flow dynamics need to be. And so, just as often as not, it feels to me the flow that you’re describing is in the wrong direction. It’s random. Sometimes it’s in exactly the right alignment, and other times it’s pointing the opposite way. And I just can’t tell. So what I’m basically saying is that at the place you’ve arrived, where you see the correspondences between the language I’m describing and your own language that you’ve been working with, understand that underneath all of that is a much more coherent model, and the fact of that underlying model being coherent is part of the reason why our languages are found to be correspondent at these points.

Alexander Bard  2:02:38

Yeah, we even use the same terms: vision, strategy, and culture are the terms we use as well. Independently of you, so there you go.

Forrest Landry  2:02:44

Great. But keep in mind that for me, that is a projection. Not your usage of them or my usage of them–but my usage of them is itself a projection of a deeper substrate. And that substrate has some very definite things to say about the particular dynamics that are required, as principles, for us to have the right sort of practices. And this is where I’m really trying to bring a lot more clarity into the field. And unfortunately, it’s at a level of abstraction that’s just genuinely hard to do. But what I can definitely point to is that this is essentially the next couple of chapters. It’s the next couple of layers that really want to come into focus–either in your work or in my work or in anybody’s work–in order for us to have a chance as a species to deal with things like civilization collapse or design, existential risk, and the mitigation of the imbalances in the relationship between man, machine, and nature as currently occurring. See, to me these things aren’t just abstractions. They are projections of a deeper thing, which is–

Alexander Bard  2:03:51

I am a radical pragmatist. So I completely agree with you on that. So absolutely. 

Tim Adalin  2:03:58

Real quick, just for clarification, can I ask you Forrest to specify with respect to culture, strategy and vision, what the correspondences are to the modalities?

Forrest Landry  2:04:12

There are a couple of ways it can be done. In this particular sense, vision is treated as transcendent, strategy is treated as omniscient, and culture is treated as immanent. And I described those in reverse order. But technically, it flows from culture, to strategy, to vision, and vision re-infuses culture, and it goes around that way. That’s the actual flow of it. I described this once earlier, when I was talking about culture being beneath infrastructure–culture would be immanent again, infrastructure would be omniscient, and then what we think of as finance would be transcendent. But it’s the sequencing that’s important. The sequencing allows us to know what the modalities are when we’re talking about practice. It’s the architecture of dependence and definition that allows us to know what the precedence is, if we’re thinking about it in terms of theory, or axiom one.

Alexander Bard  2:05:12

This is why I pointed out that Forrest is a brilliant dialectician. It’s very dialectical, his thinking. That’s why it works.

Tim Adalin  2:05:18

Yeah. So just a couple things. I’m sensing we’re moving towards a close here, Alexander, you might have been going to say something like that. I just want to reference back that the book Sand Talk is written by Tyson Yunkaporta, I forgot to mention that earlier. 

Tim Adalin  2:05:35

Otherwise, as far as the mappings go, I was on board with that. And when I said eight hours left, I didn’t mean for you to express what you had to express for us, but that there’s more to this conversation. Yesterday, I was actually re-reading some crucial chapters in Syntheism, where, Alexander, you speak about Atheos and your various gods Syntheos, Pantheos, and Entheos. There’s a quadrinity used there. And I’ve got a feeling that the articulation of the god Syntheos is going to illuminate for you, Forrest, a little bit more about Alexander’s sense of the flow.

Tim Adalin  2:06:29

And for everybody listening, I wouldn’t take what I’m saying, of course, to stand for the truth of any of this; this is just my own readings and study. And as I said in the email, I think there is alignment, and I think there is also some difference in viewpoint, and that itself is worthwhile to put into a dialectical relationship. So it’s in that regard that I think there’s more to this conversation. Even just in the formation of the triad there, I actually think there are, again, two different angles that I was hearing you guys coming from, and where I see Alexander using the symbolic, imaginary, and real, I heard that being mapped in a slightly different way. And I could be wrong about that. But basically, there’s something in there. There’s context in there to elaborate on. So anyway, that was really enjoyable.

Forrest Landry  2:07:29

I’m comfortable with that particular triad, and I think the correspondences can be variable. I’m not fixed on the particular correspondences, aside from, you know, orientation questions that might set one to be preferred. I think that the thing that I’m trying to get back to is the question you’re asking about power.

Tim Adalin  2:07:53

Yeah I’ve been doing let’s say, a bit of imploitation in this conversation. So there’s a lot in me that has a bunch to express but it’s not the time right now. And it’s getting late for Alexander.

Alexander Bard  2:08:09

I’m just going to say: the transcendence in Forrest’s work applies to our imaginary. I’m not saying they’re identical, but they’re almost the same. You can see the structure here; it works.

Forrest Landry  2:08:20

That’s good by me.

Alexander Bard  2:08:22

Omniscience is the symbolic in our realm. And then the immanent and the real come together. They’re obviously just different wordings for very similar things in the same sort of triadic structure.

Forrest Landry  2:08:35

I’m actually really comfortable with that. Your mapping makes sense to me, and would also be the way I would myself map it. So yeah, we’re good here.

Alexander Bard  2:08:46

So we’re likely to see repetition from feudalist society–not feudal, feudalist society–and repetition of capitalist society, in the sense that you will have the real asset, which today is obviously data of any kind, that creates power, immense power. The data then creates a sensocracy, so you have a new informationalist class, or a caste, if you like. Then you would have a sensocratic class that replaces what we call the political today. And then you would have a protopian class that replaces what we call academic circles today. So you would see again how these–because they’re different talents–the point here is that the reality is out there, but there are different talents when it comes to transcendence, different talents when it comes to omniscience, and different talents when it comes to understanding and possessing the real. So it’s likely that over time we have the same sort of structure we had before. They were called monarchy, nobility, and priesthood; then we replaced those, and we call them politics, the bourgeoisie, and academia.

Forrest Landry  2:09:43

I think from here we diverge because I see a whole different way of going about all that, that doesn’t need to go through any of those stages. Basically at this particular point, it’s as if I take a right angle and go in a direction that doesn’t exist yet.

Alexander Bard  2:09:59

Okay, I’m just trying to be descriptive. That’s all I’m trying to do. 

Forrest Landry  2:10:02

Oh okay.

Alexander Bard  2:10:02

Basically, it’s likely that these kinds of triads will appear again in history, because they’ve happened before. And when it comes to understanding the internet: understanding attention is a whole new game compared to understanding capital. Actually, capital is terrible at understanding attention. It requires new talents. Basically, when a new paradigm shift comes into history, it’s like a whole new level is suddenly there. And whether you’re located close to the center of the new map or not is completely irrelevant to whether you were located close to the center of the previous map. That’s exactly why the old institutions of our culture today are fighting all they can to fight down the new institutions that are trying to be born. And that’s exactly why we’re in a sort of apocalyptic, anarchic state right now. It could go terribly, terribly wrong, especially considering the risks we have to deal with anyway.

Tim Adalin  2:10:52

If I can add here: what I’m hearing is that Alexander, in being descriptive, is effectively looking to articulate and help us see how similar game-a type power dynamics are going to re-formulate themselves in the digital age. Forrest, you’re looking to invite us to see into a future that is at a right angle, in your language–a future that is not continuous with that in some important sense. It’s the "more", to use the language of game-b. But what would be interesting, and I think necessary, is that we are in fact in transition, in this very conversation, but also with respect to how this conversation meets the world at the moment. And part of the second question sort of underlying this discussion asks us to effectively respond, given the current context of where we are now.

Tim Adalin  2:11:49

I found it quite illuminating, for instance, to think about such things as, you know, the stock market, or the cryptocurrency trading market, or what’s currently going on with Facebook and governments, through the lens of Alexander’s triadic structure–how it’s manifesting with the sensocrats, the informationalists, and then the narrative that kind of goes into manipulating and formulating all of this. Very, very helpful. So on the one hand, we need to become literate in the dynamics orienting us, but at the same time maintain a kind of openness to actually be present with both. So it’s a challenging place to articulate. But I think we’ve done a pretty good job of that, actually. And there’s obviously this next part to come.

Forrest Landry  2:12:44

Yeah, I definitely agree that we need to understand the current world really well. And to really understand the current world well, we do need to go into the kinds of things that we are talking about. I feel also that to understand the next world, or the future world, or the world that we would want to live in, we need to not just use the tools and the language of understanding the current world well. We do that, yes, we must have that, yes. But in the sense of the language that was described, the imagination isn’t going to be understood in terms of the symbolic. And I think, when you said that, you were really putting a point on the fact that the computers are never going to understand the future world that we need to live in. And so in effect, the transitional dynamic is not going to be created out of any amount of symbolic understanding or any amount of data understanding. What’s meaningful is something that is beyond that, and it is fundamentally so. And so in effect, part of what I’m trying to do is to create a way of understanding that future that is a little bit agnostic about the language that we use today.

Tim Adalin  2:14:02

So is actually part of the jump, then, this change, the switching of where vision and strategy are with respect to their transcendent and omniscient mappings?

Forrest Landry  2:14:21

It’s the same thing. Part of the failure, and I think Alexander is pointing to this also, is that if we are in a sort of dualist perspective, it collapses back to a kind of monist perspective, a sort of physical monism, where we think about things in terms of, say, causation only. If you understand the world purely in terms of causation, you can’t even hold the notion that choice and change are distinct. Right? If I think in terms of space and time as being combined, I lose track of the fact that possibility is a different kind of vector than either time or space.

Alexander Bard  2:15:06

We call it hyper time.

Forrest Landry  2:15:09

I know and I remember you mentioned that and I tried to get into that a little bit on the email list, but it was just too tricky to try to do in writing. So I let it go.

Alexander Bard  2:15:20

Hyper time allows for a continuum to actually exist, rather than just zeros and ones and discrete states. It’s something that machines cannot grasp.

Forrest Landry  2:15:29

Well, I mean, you know, Hilbert space, for example, is an infinite dimensional space. And it’s defined by a kind of continuum. 

Alexander Bard  2:15:34

Yeah. 

Forrest Landry  2:15:35

A sort of dual continuum, in the sense that it’s a continuum of two different orders, both within the dimensions and across them. I mean, you know, fractal dimensionality is actually allowed for. So there’s a sense here that we can create awareness around the correspondences between our various works. But I think that part of the issue we’re pointing to, in regard to your question, Tim, is that if we are living in the immanent, it’s hard to understand that the transcendent and the omniscient are distinct. If we’re living in the omniscient, if we treat the world from the omniscient, it’s hard to believe or to understand that the transcendent and the immanent are distinct. And you can’t move from the omniscient into the transcendent unless you actually know and can work with the difference between the transcendent and the immanent. 

Forrest Landry  2:16:28

This goes back to the pre/trans fallacy of Ken Wilber. If you’re in the prior state, you don’t even know that there’s such a thing as a post state. And going through the process to get to the post state reveals things to you that are altogether different from what you could know in the prior state. So in this particular sense, when we’re talking to most philosophers, there’s literally zero awareness that the transcendent and the omniscient are distinct. That they need their own ways of working, that they have completely different methodologies. They conflate the notion of vision and strategy as if they were the same. Now I know that that’s not always the case. But I’m basically saying that–

Alexander Bard  2:17:10

This is what we call the two-headed phallus.

Forrest Landry  2:17:13

The two-headed phallus.

Alexander Bard  2:17:15

Yeah, the two-headed phallus. Yeah, the phallus with the two heads. 

Forrest Landry  2:17:18

Interesting. 

Alexander Bard  2:17:20

And they must admire one another without knowing each other at all. That’s what we call it, poetically.

Forrest Landry  2:17:26

So they’re clearly running into the same sort of dynamics. And I’m basically at this point trying to give articulation to what’s on the far side of that. Now, I can try to do that in terms of the existing language, and the better I am at doing it in terms of the existing language, the worse an idea you’re going to have of what the future is. But the more I try to describe the future in terms of its own language, the harder it’s going to be to understand from where we are sitting today.

Alexander Bard  2:17:53

That is why we say that the past is best described with logos and the future must be understood with pathos. Pathos is a narrative to us. Pathic narrative.

Forrest Landry  2:18:05

I’ll have to think about that. I sense that there’s some elements of correspondence there. But there might be some nuances that are worth exploring, because I very much look at the future in terms of situations which are not narrative. It’s interesting. 

Alexander Bard  2:18:18

Yeah we call it pathic narratives. It’s not traditionally understood as narrative. We think everything is narrative, but we speak clearly about different narratives. And again, these narratives, logos, mythos and pathos also rhyme with your transcendence, omniscience and immanence as well. 

Forrest Landry  2:18:36

Back to you, Tim.

Tim Adalin  2:18:39

Well I think, just for my part in closing, there’s something profoundly important to the process of transforming intention, and in that sense an inner vision, into speech that then becomes the propositions or things that can be traded back and forth, and in that sense become, maybe, objects we can then pass to each other. And it’s in the process of doing this, the process of giving voice, in an immanent sense, but oriented from this deep attempt to communicate the soul’s image, the image in the soul, the image of the future, in this sense of vision. At least for my part, that’s where I would direct attention. And the interest, the dedication to the ongoing nature of this conversation, is actually to be found in building structures of invitation that enable a sacred participation in the artfulness of conversation. That is the temple in that sense, but also the praxis, the participation in the temple, which enables the transition of this kind of age. Something like this. 

Forrest Landry  2:20:12

It sounds like a great cliffhanger.

Tim Adalin  2:20:52

Awesome. Alexander, I know it’s getting late for you. So thank you both for joining and I suppose I’ll end the recording here.

Alexander Bard  2:21:01

Yeah. Love you both to bits.

Forrest Landry  2:21:04

Thank you so much.

More from the Voicecraft podcast

2021-08-21T12:19:18+10:00