What Comes After Post-Modernism?

Horacio (aka vruz) pointed me to this 2006 essay in Philosophy Now by Alan Kirby on "the death of post-modernism." I read it yesterday evening (on paper, taking notes). Yes, I am old school when it comes to reading anything over a page or two.

I am no expert in art, philosophy, or literature, so the terms modernism and post-modernism don’t run deep in my brain. But from a simplistic point of view, I understand that modernism and post-modernism define the 20th century in western culture. Modernism first emerged in the late 19th century as a reaction against the romanticism that had dominated western civilization for much of that century. Modernism embraced the industrialization of society and the emergence of breakthrough scientific thinking. In art and architecture, modernism brought simplicity and new artistic forms.

Post-modernism was the post-war (WWII) reaction to modernism. It re-embraced historical contexts but in a modern form. Post-modernism was complex, ironic, and ambiguous.

So with that backdrop, what comes after the "modernist" era (which in my mind includes both modernism and post-modernism)? Kirby suggests a new ethos is emerging that he calls pseudo-modernism. I don’t like that word. But his observations ring true to me.

I believe there is more to this shift than a simple change in cultural fashion. The terms by which authority, knowledge, selfhood, reality and time are conceived have been altered, suddenly and forever. There is now a gulf between most lecturers and their students akin to the one which appeared in the late 1960s, but not for the same kind of reason. The shift from modernism to postmodernism did not stem from any profound reformulation in the conditions of cultural production and reception; all that happened, to rhetorically exaggerate, was that the kind of people who had once written Ulysses and To the Lighthouse wrote Pale Fire and The Bloody Chamber instead. But somewhere in the late 1990s or early 2000s, the emergence of new technologies re-structured, violently and forever, the nature of the author, the reader and the text, and the relationships between them.

And

the culture we have now fetishises the recipient of the text to the degree that they become a partial or whole author of it. Optimists may see this as the democratisation of culture; pessimists will point to the excruciating banality and vacuity of the cultural products thereby generated (at least so far).

And

Postmodernism conceived of contemporary culture as a spectacle before which the individual sat powerless, and within which questions of the real were problematised. It therefore emphasised the television or the cinema screen. Its successor, which I will call pseudo-modernism, makes the individual’s action the necessary condition of the cultural product.


Kirby is right. We’ve moved into a new phase of society, one that emphasizes participation in culture, society, technology, and politics. If it weren’t such a mouthful, I’d suggest we call it participatism. Kirby makes a bunch of additional observations worth sharing.

pseudo-modern cultural products cannot and do not exist unless the individual intervenes physically in them. Great Expectations will exist materially whether anyone reads it or not… Big Brother, on the other hand, to take a typical pseudo-modern cultural text, would not exist materially if nobody phoned up to vote its contestants off.

and

Pseudo-modernism also includes computer games, which similarly place the individual in a context where they invent the cultural content, within pre-delineated limits. The content of each individual act of playing the game varies according to the particular player.

and

The pseudo-modern cultural phenomenon par excellence is the internet. Its central act is that of the individual clicking on his/her mouse to move through pages in a way which cannot be duplicated, inventing a pathway through cultural products which has never existed before and never will again. This is a far more intense engagement with the cultural process than anything literature can offer, and gives the undeniable sense (or illusion) of the individual controlling, managing, running, making up his/her involvement with the cultural product.

and

In all of this, the ‘viewer’ feels powerful and is indeed necessary; the ‘author’ as traditionally understood is either relegated to the status of the one who sets the parameters within which others operate, or becomes simply irrelevant, unknown, sidelined; and the ‘text’ is characterised both by its hyper-ephemerality and by its instability.

and

Much text messaging and emailing is vapid in comparison with what people of all educational levels used to put into letters. A triteness, a shallowness dominates all. The pseudo-modern era, at least so far, is a cultural desert.

and finally

There is a generation gap here, roughly separating people born before and after 1980. Those born later might see their peers as free, autonomous, inventive, expressive, dynamic, empowered, independent, their voices unique, raised and heard: postmodernism and everything before it will by contrast seem elitist, dull, a distant and droning monologue which oppresses and occludes them. Those born before 1980 may see, not the people, but contemporary texts which are alternately violent, pornographic, unreal, trite, vapid, conformist, consumerist, meaningless and brainless (see the drivel found, say, on some Wikipedia pages, or the lack of context on Ceefax). To them what came before pseudo-modernism will increasingly seem a golden age of intelligence, creativity, rebellion and authenticity.

That was a lot of quoting of someone else’s work. I fear not enough of you will click thru and read the essay, so I’ve cut and pasted (is cut and pasting itself a fundamental part of participatism?) the best parts here so we can have a conversation about it. I think it’s an important discussion. On to the comments. We embrace participatism here on this blog.

#VC & Technology

Comments (Archived):

  1. scott crawford

    Thanks Fred. I need to click through and read and mull the whole thing (later, gotta run). But at first blush, I keep hearing the hyphenated words co-authorship and co-creation ringing loud and clear. Again, off for now, but very much appreciate the time and thought you put into this. Will return.

  2. Tom Cunniff

    Interesting, and merits further digestion and discussion. I think culturally, having the author (of a film, or book, or play, or ad, or blog, etc.) wink at us and ask “aren’t I clever?” has become stale. Perhaps ironic detachment is a useful form of rebellion only if one feels powerless to actually spark meaningful change. The important shift is that if one has the time, inclination and intellectual horsepower to engage in serious debate on issues, there is now a lot to learn and some small chance to influence the direction of the discussion. The author has identified and named this shift, but IMHO it’s far too early to condemn the new “texts” that have arisen from this in culture as being entirely vapid and valueless. Not every blog comment is meant to stand the test of time as a work of art, any more than everything we say in casual conversation is meant to be transcribed and passed to posterity for future study. Sometimes talk is just talk.

    1. vruz

      Agreed, it’s possible it’s still too early to condemn. But the moment to decide whether we continue to produce Britney Spears-grade cultural products is RIGHT NOW.

  3. Amanda Chapel

    Pseudo-modernism isn’t a movement; it’s an anti-movement. By making everything equally valuable, nothing has value. The result is “a weightless nowhere of silent autism.” The questions are: What are we sacrificing? Why do we expect value to emerge from an anti-system that works to constantly reduce value? For example, as the Web2 evangelists work hard to de-professionalize business, why on earth do we expect there to be business if they succeed? If you de-formalize the value chain, don’t you ultimately, consequently, produce shit? I’ve said to you before, Fred: the levee broke and we stand in awe of the abundance of water. That’s silly… stupid… dangerous.

    1. fredwilson

      Amanda, who are you? Are you a real person?

      1. Amanda Chapel

        As real as you.

        1. Amanda Chapel

          Now how ’bout answering my questions.

          1. gregorylent

            this should be a separate, and deep, conversation … values win in the end because they are based on the way nature works, and the way consciousness works … when foolishness prevails, suffering grows … in india it is called dharma, and with that concept it is easy to see the ridiculousness of much of modern american life ….. economists might say it is not sustainable, and that is the least of it

          2. zburt

            if you wanted your questions answered, you would write a legible post. top executives no longer dictate value. unless you deem them to all have exquisite taste…

    2. dick costolo

      I don’t really get the line “as the web2 evangelists work hard to de-professionalize business….” What we are seeing is the removal or reduction of friction from value chains. That doesn’t reduce value; it facilitates the creation of value. You can say this de-formalizes value chains, but who cares, because why would this lead, ipso facto, to the production of crap? Ebay removed the friction of geography in the economy of used goods. That created enormous new value. Who cares if ebay “de-professionalized the sale of physical goods and de-formalized the value chain”? Everybody’s comments here are the result of the reduction of friction of being a ‘publisher’. Your very comments here contribute to the thesis that participatory content CAN be more valuable than high-friction, high-cost, broadcast content. There is NO QUESTION that participatism increases the ratio of noise to signal; however, it also increases the likelihood that important content will see the light of day, and there will be enormous value created by the companies that understand how to separate the wheat from the chaff. Finally, since there’s no law that says Paramount Pictures has to let me edit scene 4 in all their summer movies, the market will decide what percent of the market professionally and formally produced content will have over more socially produced content. So, what’s the trouble?

  4. alan p

    Egads – Modernism 2.0 ;). So do I write my own blog post or comment here….. Darn – I will just have to write something on my blog – here is the stub: Still, the fact that this was flying around Twitter last night (I read the article there after watching an exchange between @vruz and @amandachappel) at least shows Twitter’s gone up another notch in its content level!

  5. alan p

    Oops – stub is here – except I’ve not written anything in it yet. It only exists in the mind of the observer…. http://broadstuff.com/archi… Beat that for modernism 2.0 🙂

    1. fredwilson

      Hmm. Flow is silent autism? I am going to need to noodle on that one

      1. alan p

        It hit me when I read his definition of silent autism… I thought it was interesting, and a tad provocative too of course 😉

  6. angusprune

    Again, I’ve not read the whole article yet, but I think that Google’s algorithm is probably the root of all this – either spiritually, or simply as the first manifestation of this new ethos. Just passively linking to something was suddenly given a concrete effect on that object – by increasing its PageRank. As things have moved on, simply clicking on a Google result will potentially affect its ranking. Services like Digg and StumbleUpon have started to take this further by giving you a ‘one click’ influence. The next step will be ‘no click’ influence – the very fact that you have even viewed something will have some concrete effect on that material. This move is replacing (in some areas) the currency of money with the currency of attention – consumers voting with their time, instead of their money. Services like last.fm show you a glimpse of this – with top tens based on attention rather than sales. When we move to some kind of subscription- or play-based model of music pricing (inevitable one way or another) it will change the music industry more radically than people think. For instance, manufactured pop tends to sell a lot of records in a short time, and is then forgotten. Longevity doesn’t matter, the records are already sold; but based on attention, the shift will move away from flash-in-the-pan acts to more authentic acts with longer-lasting appeal. As to what to call this new era – I don’t know, but pseudo-modernism feels too short-sighted. I think that the changes are going to be too significant and far-reaching to be lumped into the modernist camp. We’re not only starting a new camp, we’re moving to a different forest.

    1. fredwilson

      Agreed. Pseudo-anything is the wrong term for something so real as this movement

      1. Amanda Chapel

        Angusprune, replacing “the currency of money with the currency of attention,” indeed. Here’s an analogy: Bethlehem, Pa. is replacing the former Bethlehem Steel Corp. with a casino. Okay? Do you see the difference? And Fred, again, you seem to be in awe watching the flood rather than evaluating the consequences. The “attention” economy, like the casino above, DOES NOT MAKE ANYTHING VALUABLE! That’s what is meant by “pseudo.”

        1. fredwilson

          why does something have to be physical and tangible to have value?

        2. Mark 'Rizzn' Hopkins

          To say that the casino doesn’t make anything valuable is in itself shortsighted, Amanda. It makes entertainment, which, looking at the US economy, is one of the most important chunks of the whole mess. Forgive me if I’m wrong, but aren’t you in Marketing and Communications? Point to me what hard and concrete thing you create of value to your customers. Can you hold it in your hand? Is it real? Does that make it less valuable? No. Money, as in currency, is worthless. Why do you think Bill Gates spent the first half of his life amassing his wealth and pledged to spend the second half giving it away? Because after a certain point, unless you’re gold-plating your commodes, you just can’t spend it all in a way that adds value. Given he has a conscience (contrary to my Linux-using friends’ popularly held beliefs), he decided that the best way to improve his quality of living would be to improve the world through charitable giving. Speaking more directly to your criticisms of Web 2.0’s efforts to ‘deprofessionalize business,’ it makes the world more of a meritocracy. If you put in the work and you have the ability, you get ahead – the market deems you more valuable. If you sit on your duff and your high-falutin’ degree and collect a check, you get left behind. Moreover, it de-centralizes business. In terms of efficiency of the organization, smaller focused companies rule the day. You don’t need a team of thousands to put something in the hands of every man, woman and child anymore. You can simply assemble a team of two, five, or ten and create something valuable enough that it will sustain itself as a business, and the proprietors of that business, in perpetuity. The interactive nature of virtual goods and service creation means that what you invent isn’t always what ends up in the hands of the consumer – and that’s fine, because the engagement is the loyalty is the product is the value. In short, the de-centralization, removal of hierarchy, and dis-incentives to elitism serve to broaden each of our slices of the pie by use of the free market, without implementation of socialist policies of punishing achievement and rewarding slack. It functions philosophically on the same tenets that built the Internet itself and has served to make it the powerhouse it is.

  7. alexandrosM

    Excellent blog post, very insightful observations. One point I disagree with (and therefore have something to add to this conversation) is the following: "The pseudo-modern cultural phenomenon par excellence is the internet. Its central act is that of the individual clicking on his/her mouse to move through pages in a way which cannot be duplicated, inventing a pathway through cultural products which has never existed before and never will again." Could not the same be said about someone flicking through various books in the bookshop? This is not a new type of experience; static websites containing text are simply a part of the older worldview. On the other hand, Big Brother, Digg, Wikipedia-style experiences cannot indeed exist without their audience. In this context, the author of Wikipedia is not the person who puts the content in place, but the person who sets the rules. If you think this author is "irrelevant, unknown, sidelined", there must be some kind of misunderstanding, because this author is essentially still the god of his universe. It is just that this ultimate authority has decided that instead of authoring every single bit of the experience, they can move to meta-authoring, that is, authoring the environment within which the participants will author. And this is just as vital. In fact, it makes possible work on a scale previously impossible (Wikipedia). In my mind, the questions that arise are: who is going to be the first product/website/community owner that goes one step further and gives up its meta-authoring rights to the participants? Will they retreat to a meta-meta-level? Is it possible to create a community without an active author, where the author will simply control the starting point?

    1. fredwilson

      Hmm, that last question is interesting, but I am sure someone has already tried it

      1. gregorylent

        the conversation is (ok, will be) the blog

      2. jaredran

        Wouldn’t any of the online aggregators (e.g. Topix, outside.in) apply to this meta-meta construction? Those sites don’t author anything except the organization of the content (much of which is automated), with little editorial voice coming from the Site Owner/Author.

        1. alexandrosM

          If they are anything like Techmeme, then the determination of the automation/algorithm is the whole meta-authoring. I was thinking more along the lines of a Digg where the users get to determine the ‘algorithm’. Think about it: with traditional news outlets, people complain about editors controlling the content. Then social news sites come along and all is nice for a while, until people realize that it’s all about the algorithm, and therefore the pressure focuses on the algorithm itself.

    2. vruz

      this meta-meta-thing you propose already exists and has existed for decades now, and it’s called “open source”. also, the author explicitly addresses your bookshop experience argument: he acknowledges participation is not new, but it was never before a *required* ingredient of cultural products.

      1. alexandrosM

        open source is indeed a good example, but even this seems to adhere to some structure. Maybe the licences (GPL, BSD etc.) can be thought of as the framework within which the work is done. IP laws also play a significant role. I do not have a complete answer yet though. At the end of the day, reality itself is bound by laws (of nature at least) and we all participate in its evolution. So it really would depend on how a ‘product’ is defined. Is Wikipedia a product? Is a single page a product? Maybe a single edit or comment or even a thought? Is open source a product? Maybe the entire internet or the world itself. It really depends on how big or small you want to go. Perhaps what is new about this era is that the structure of these systems has been exploded. We can now see an encyclopedia form from first draft to final version, with all the revision history and discussions and input from each contributor, etc. Previously, the book would simply appear in print, giving the illusion of a single piece of work when in fact it too was composed from smaller elements, a process that was transparent only to its author.

      2. mastermark

        @vruz: But that’s not true, either (that it was never *required*). Arguably, all “performance” art is participatory, and always has been, just in varying degrees. Again — it’s about technology as an enabler. So if you compare participating in a performance art act from the audience vs. in one of the acts cited by Kirby (Big Brother on the tube, any given Internet forum, etc.), the technological differences enable behavioural ones. Anonymity being a key difference, and one which obviously *encourages* participation. The argument that any of this is new is specious. The only valid argument (using the technology-as-enabler tack) is that the degree and intensity is new. Having said that: +1 on open source.

        1. vruz

          all performance art may be participatory in the sense of the immediacy and immersiveness of the spectator’s experience of the work of art. performance art still has an author, a choreographer or whatever name you want to give to the initiator. Just as Joyce’s Ulysses pre-figures postmodernism, certain works of performance art may have pre-figured pseudomodernism. What do you think?

          1. mastermark

            Well, I’m arguing that there is no such thing as “pseudomodernism” – so I suppose I have to disagree. 😉 As Steve Kane pointed out in his comment on Ways of Seeing, the creation of meaning has always been a participatory act, in all forms of art (arguably, in all semantic acts, à la Wittgenstein). Do the new technologies change that? Nah. Do they make new semantic forms possible, for which we do not yet have adequate critical tools? Perhaps. Who’s the “author” of what I’m writing (and you’re reading)? Fred Wilson? Kirby? You? Me? Does it matter? The point is — it’s not new, it’s just an extension / amplification of what the species has always done (and been) — technology being the extender / amplifier. It’s as good (or bad) as it’s ever been, or ever will be. Sturgeon’s Law (90% of everything is crap) has *always* been true, and it always will be. Technologies like the Net or cell phones, or reality TV shows, haven’t changed that — they’ve just made people like Kirby more aware of the 90%. You ask a different question from Kirby — you want to know if these technologies can enable more good. I suppose I’m trying to articulate that they will (most likely) enable just as much good (and bad) as any other tool ever has — see Sturgeon’s Law again. People get excited about new things, and have utopian fantasies about them — that’s fine. But in your posts (and in Kirby’s article) I sense a belief that these tools are generating more bad than has been the case in the past. I don’t think that’s true. Sturgeon’s Law works in the converse — and that 10% is quite reliable.

          2. vruz

            Sturgeon’s Law is an adage in a work of fiction, not a natural law by any means. He didn’t even call it that. I think that 90% of crap is a self-fulfilling prophecy, just as Moore’s Law can’t prevent Intel from missing their goals… it’s still a standard Intel tries hard to meet. Same thing for Sturgeon’s Revelation. If 90% crap is the standard you want to meet, 10% is the room you are leaving for non-crap. Is that optimal? Is that good enough? Is it possible to do better? You are not answering any of those questions, you are giving me dogma, and that won’t cut it. Moreover, I have analysed statistics where certain user-generated-content virtual worlds have, for three consistent years, violated that law by a margin of 20%. Your belief, for some cases, runs contrary to reality. You are probably right about Kirby’s limited awareness of pop culture, and that his sensibilities are very obviously hurt the most by some of the worst cultural products mankind has ever produced. You are right that Kirby is an extreme pessimist in this regard, but I consider myself to have a relatively informed view of pop culture, considerable familiarity with technology, and a past life of varied interests in literature and other arts. I think my point of view is balanced enough to appreciate that your point of view is at the exact antipodes of Kirby’s; from what you have expressed here, you are an extreme optimist. You can’t possibly sense a belief in me, because my current state is one of doubt and curiosity. I can understand that the philosophy of laissez-faire is a widespread one, but you can’t reasonably propose that it’s the only possible, uniquely correct point of view.

          3. mastermark

            Ah, you misunderstand me, but that’s my fault for being unclear. First off, Sturgeon’s Law (or Revelation (true, but a distinction not worth making, IMO)) is just a pop culture way of expressing a Pareto distribution. It doesn’t really matter whether the numbers are 90/10, 80/20 or 50/50. I don’t believe in any numbers — all I am asserting is that these technologies will not change the existing distribution. Whatever the numbers were yesterday, that’s what they’re most likely going to be tomorrow. While I find your (implied) hope that these technologies could be used to change those numbers for the better impressive, I am as sceptical of that as I am of Kirby’s assertion that they have changed them for the worse. It’s hard for me to understand how you could see that as being “optimistic”, let alone “extreme”. 😉 Hence my assumption that you’ve misunderstood me. As for sensing a belief in you, I certainly did infer one (and I don’t think my inference was unlikely, given your other statements). But that inference was wrong, and you have clarified it. Funny how that creation of meaning stuff works. Hard for me to see how this “participatism” stuff makes that *worse*. ;D

  8. jonathanpberger

    > (is cut and pasting itself a fundamental part of participatism?)

    Pastiche (aka cut’n’paste) is a characteristic technique of post-modernism, so in that regard yr right on track.

    1. fredwilson

      Pastiche – what a great name for a tumblog

  9. awilensky

    Anyone who quotes or writes the word ‘vacuity’ is alright by me. I catch some TV shows like “Are You Smarter Than a 5th Grader?”, and these kids, who mostly compete against 25-year-old models and celebs, are really smart.

  10. SamJacobs

    Thank you for reposting, Fred. This feels like such a fertile period in our culture/race’s evolution. Amanda’s comment that this is an anti-movement I think misses the point. Putting a pejorative on it *feels* a bit cliché to me. We sound like our parents. In a society where everyone has a voice, a bunch of different things seem to happen. The proliferation of voices *probably* commoditizes any individual voice on the margin, and the cultural dialogue shifts from a lecture to a conversation. That will understandably feel weird for people used to lecturing and/or people who have felt comfortable within the old structures and contexts. Using the music industry as an example, maybe it means fewer superstars. But, to the point of all the music being created, maybe it also means that there is an articulation of a community where before there wasn’t. Micro-communities where smaller groups are having more conversations among themselves, and “culture” as defined by a broad set of values, characteristics, likes, dislikes, etc. becomes fragmented and disaggregated. That’s all pretty obvious but kind of interesting. I’m used to defining myself in the context of a broad pursuit with clearly defined mileposts and goals, but I think those definitions are shifting. Feels healthy though.

    1. chartreuse

      I see it as a powershift (to quote Toffler). People are beginning to realize their individual power, which shifts everything. And these changes don’t just happen in a vacuum. Or just on the internet. The same reason sites like Digg and Last.fm expand is the same reason our art is getting cleaner and gays will eventually be able to marry everywhere: the powershift to individuals taking more control of their lives. Of course there is a downside to all this participatism (much better word than pseudo-modern, btw). Small groups (niches) clamoring for influence could lead to more violence (see France). The lack of cultural touchstones could lead to the denigration of the Nation/State (see America). But no matter how it all ends up, the fact of the matter is that it’s happening. And we can’t stop it.

      1. dick costolo

        +1 for participatism

      2. vruz

        “participatism” focuses on the mechanic, the mere cog in the machine. “pseudo-modernism” – whilst it may have a negative connotation in daily street speak – does not have a negative intention from the academic point of view. It’s simply stating the faux sense of being in command, of being capable of actually making a change, when in reality the vast majority of the population can’t make any. the only participation exists within the realms of the sandbox you’ve been given, your particular corner of a Truman Show. “participatism” gives up a priori on the possibility of making a real change. you participate, you feel good, be a cog in our machine, stay in faux-reality, don’t ask for more. I propose something different: let’s make it real.

      3. tweetip

        chartreuse says “But no matter how it all ends up the fact of the matter is that it’s happening. And we can’t stop it.” In that this movement of participatism depends on light flowing over fiber, any entity that focuses on stopping it can stop it – as Verizon & gang did with Usenet. Tomorrow, it will be Google…

  11. eric susch

    I’m not sure people or culture have changed as much as the article suggests. The thing that skews our vision of the past is that everything was pre-filtered by technology (access to a printing press, for example), industry (gatekeepers like publishers) or time (who the heck is going to hang on to anyone’s “letters” unless there’s something important in them?). Today it’s different. Because of digital technology everybody has access to everything. Consider every home movie ever made. Previously a few in the family would suffer through it once, then never watch it again. Today it’s on YouTube for the world to watch over and over and over… and then comment on how crappy it is. Does that mean more crappy home movies are being made? Possibly, but there’s a LOT more good stuff too. You just have to search through everything else to find it. Cultural movements still exist too. They’re just not “universal” and in your face. It’s all still there. You just have to dig beneath the surface and find it.

  12. gregorylent

    technology has enabled banality … but “pseudo-modernism” is an unfortunate label and indicates that the author cannot see beyond the past. so many shifts are going on in this time, all of them having to do with creation of a greater understanding of what value is, of what a human being is, of what life on earth is. one would be that we are shifting from quantitative to qualitative valuations, and it is not surprising that those who don’t get this, or don’t understand this, are clutching on to metrics, monetization, and the commodification of experience … they are to be ignored, and life is ignoring them. cause and effect are not what they seem. the author of that paper is in some ways exactly what he is bemoaning, offering nothing in terms of progress, only giving a retro view

    1. mastermark

      While there are some interesting ideas in Kirby’s post, it is also deeply flawed by specious arguments. Consider the following: “A culture based on these things can have no memory – certainly not the burdensome sense of a preceding cultural inheritance which informed modernism and postmodernism.” True – but: who says that our culture (emerging or current) is or will be based *solely* on these things? One of the defining differences between post- and modernism was the insistence of post-modernism that it simultaneously replaced *and* sustained modernism. Kirby implies a binary state here, where that which has or will come entirely replaces and supersedes all that was before it. There is no justification for this argument, nor any reason to believe it. Both modernism and post-modernism are very much alive. Consider: “The occasional metafictional or self-conscious text will appear, to widespread indifference – like Bret Easton Ellis’ Lunar Park – but then modernist novels, now long forgotten, were still being written into the 1950s and 60s. The only place where the postmodern is extant is in children’s cartoons like Shrek and The Incredibles, as a sop to parents obliged to sit through them with their toddlers.” Nonsense. This dude clearly watches too little television. The Wire, The Sopranos, Six Feet Under, Deadwood, Lost, Battlestar Galactica, 24 — just to rattle a few off the top of my head. Post-modernism screams from the little screen on a daily basis. Or this: “In music, the pseudo-modern superseding of the artist-dominated album as monolithic text by the downloading and mix-and-matching of individual tracks on to an iPod, selected by the listener, was certainly prefigured by the music fan’s creation of compilation tapes a generation ago. But a shift has occurred, in that what was a marginal pastime of the fan has become the dominant and definitive way of consuming music, rendering the idea of the album as a coherent work of art, a body of integrated meaning, obsolete.” Bah. Nonsense. Who’s guilty of a lack of awareness and understanding of the past now? The “album” was a commercial construct, derived from manufacturing and distribution constraints, and only used in the coherent way referred to here by artists post-60’s and 70’s, as a way of adapting their art to this medium. In fact, the vast majority of the history of *popular* music (excluding classical and opera) is a narrative (ooh! arch your eyebrow with me now!) of the *track*, stretching back into the mists of recorded history. Bards, troubadours, minstrels, in a pattern seen to be remarkably stable across ancient cultures (Western, Eastern, African and Native American), sang *tracks*. If this sort of specious reasoning is exemplary of the amount of rigour that’s gone into Kirby’s essay, the post-post-modernist in me thinks “Dude, WTF?” Indeed, I think it’s interesting that Kirby keeps making these sorts of binary arguments. Consider: “Secondly, whereas postmodernism favoured the ironic, the knowing and the playful, with their allusions to knowledge, history and ambivalence, pseudo-modernism’s typical intellectual states are ignorance, fanaticism and anxiety: Bush, Blair, Bin Laden, Le Pen and their like on one side, and the more numerous but less powerful masses on the other.” Please. This (and the whole argument about post-9/11 politics that follows it) is just silly. Where, one wonders, were the ironic, knowing and playful governments *prior* to 9/11? I must have missed those, in the last 50 years, hiding under a rock with my dog-eared copy of “Foucault’s Pendulum”. Lol. People making binary arguments tend to fall into two groups (ooh! arch it with me again! ;)): those compelled by fear (and its partner, anger), and those wishing to manipulate others by invoking same. I hear a lot of the former in Kirby’s arguments, and wonder about the latter in the comments of meta-creatures like Amanda Chapel. To paraphrase Winston Churchill: get over it.

      1. dick costolo

        I’m totally stealing the line… “people making binary arguments tend to fall into two groups”. That is absolutely outstanding.

        1. mastermark

          ;D

        2. Mark 'Rizzn' Hopkins

          There are 10 types of people in this world. Those that understand binary, and those that don’t.

    2. mastermark

      Hey, FWIW, I didn’t mean to reply to your comment, particularly — just stupid, and clicked reply in the wrong place. Mea culpa. I agree with most of what you say, here and elsewhere in these comments. However, it’s worth pointing out (and I suspect you would agree) that the statement “technology has enabled banality” can be reduced to “technology has enabled X”, where X can be pretty much anything. That’s the point about technology: it’s an enabler. Kirby is asking us to accept a moral judgement about a tool that he is using to make said argument, and seems largely unaware of the irony of that.

      1. gregorylent

        yes, thought you clicked the wrong reply, though sometimes disqus has a mind of its own… agree with your “enables x” … i like the way vo nguyen giap said it in 1967, about the west, “ah, your computers serve only to render your ignorance more efficiently” … the thing disappointing for me about kirby, he seems to have no context within which to place his understanding, and no idea what any of it implies about human beings …. other than that, he is wonderfully articulate … by the way, your blog has some interesting topics as well

  13. wmfischer

    “The culture we have now fetishises the recipient of the text to the degree that they become a partial or whole author of it.” “Postmodernism conceived of contemporary culture as a spectacle before which the individual sat powerless….” Here’s a Marxist reading (I’m no Marxist, just employing some of his methodology and the implication that economics drives cultural developments). Perhaps the shift that Kirby is alluding to is one of patronage. The internet and other more recent mediums allow for a broadening of the base of those who commission cultural artifacts. And just as patrons of art have traditionally wanted to place their imprimatur upon art through engagement with the artist, this process is holding constant while the number of patrons is expanding. This democratization removes some of the traditional cultural authorities and continues the postmodern dialog of “authorship.” Whether it’s etsy, mash-ups, or youtube, one can see that by putting the means of production and the capital to commission into the hands of a considerably larger population, a different type of art emerges.

    1. vruz

      interesting point of view. someone else on this thread mentioned the BBC TV series “Ways of Seeing” (which also spawned a book). “Ways of Seeing”, towards the end of the series, talks about precisely this. It was published in the 80s and was revised in the 90s to accommodate new developments in the advertising world; it didn’t include commentary on ‘Social Media’ or ‘Web 2.0’ for obvious reasons, but the author did ponder on this possible Marxist reading you refer to. In a world of micropayments everyone is a micropatron.

  14. zburt

    to samjacobs and wmfischer: while on the one hand we might see a diffusion of taste and consumption, on the other hand this may lead to a consolidation of taste, driven by a new authority. the new autonomy of the individual is laughable; although the individual has a new variety of ways to access his content, he is driven by a new groupthink. authority has transferred from the editor of the newspaper to the man in alabama i don’t know who pushed that story to the top of digg. a new dangerous tyranny of the majority may be emerging; hark!

  15. scottythebody

    I can’t find it right now, but I think Momus wrote a very good essay about this movement, related to sampling, participatory writing, etc., called “Copy & Paste Culture”. Unfortunately, this “movement” has since had its name taken over (at least on Google) by discussion of plagiarism. These ideas were looming large in my mind back when the “Grey Album” came out (the Danger Mouse/Jay-Z/White Album mash-up). While all of this had been brewing for a while, I consider that album to be the first great “work of art” from the movement. It also had the additional function of producing the first net-only “platinum” record, thereby validating the total irrelevance of the recording industry. There’s definitely *something* there: weblogs, mash-ups (musical and technological), social networked identity and on and on. The creative tools are just forming. What remains to be seen is whether it is really a new movement and change in world view, or something more akin to the invention of new methods of expression: like cinema, which united moving pictures, dramatic arts and eventually sound into a new art form.

    1. gregorylent

      what you are saying could be interpreted (at least by me 🙂 ) as an indication of group consciousness being seen in the world, simply, a larger awareness is replacing a narrower one, and it is all quite natural …. it does make a few commercial interests squirm though….

  16. killercoder

    I have been looking over your blog for a while, and I was thinking if will like to echange link with me. My Site URL is http://symmetricsolution.net. It will be an honor if you exchange link with me. Thank You.

  17. Steven Kane

    i think this debate is maybe apples and oranges. modernism and post-modernism were/are philosophies. schools of thought. ways of seeing the world. the art/culture/artifacts created within the movement or philosophy were/are supposed physical representations of those philosophies (at least in the eyes of critics and observers; most artists usually bristle at having their work pigeonholed.) “participatism,” on the other hand, while bullseye relevant to any discussion of our times, seems more a description of a technique or structure or method. like thinking about jackson pollock as “the drip method” (a painting technique) versus “abstract expressionism” (a philosophy). also, while i certainly agree the internet and other technologies are radically enabling conversation/interaction and audience participation to the point where its absence itself seems like a philosophical statement, i don’t agree that “interaction” is a new or revolutionary technique in the arts or literature etc. its all well and good to say that a book exists without a reader, but really, it doesn’t. a book without a reader is a door stop or kindling material. pick up a text written in sanskrit (assuming you don’t know how to read sanskrit!) is that a book? no, its a meaningless hunk of paper and ink. only through the interaction of writer, text and reader does a book exist as a book. thats even more true with dramatic arts. heck, the symbols of drama are the laughing and weeping faces — an explicit statement that drama exists only when an audience participates, both physically and emotionally. vastly better than anything i could say here, john berger brilliantly charts these waters in his 1974 tv series and book, Ways of Seeing (which has dated a bit but is still essential). (gosh, i’m having a flashback to art school – excellent fun.)

    1. gregorylent

      nice comments, and good to recognize that seeming realities are only concepts … interaction is what humans do, and of course continues … did you ever read jack vance? he has a story where each member of the culture takes turns in every different role, and in some ways we are blurring the lines between audience and participator, which is way cool …

    2. vruz

      I absolutely loved “Ways of Seeing”. (it’s a published book too, not sure if it’s still in print; it’s highly recommended.) Art, media, culture and society can’t be separated from philosophy; it’s impossible to understand the products of a given culture or epoch without digging into the rationale, the thoughts that brought them into being. It’s also not possible to understand the times we live in by resigning ourselves to becoming a cog in the machine of our times. great comment Steve, thanks.

    3. scott crawford

      Yep, yep. Good stuff indeed.

    4. Steven Kane

      with a little more time to think this over… the effect of “participatism” on aesthetic philosophy (e.g. movements like modernism and post-modernism) will undoubtedly be large. maybe it already has been. but i can see a double-edged sword. in the end, the greatest effect of participatism is the lowering, or elimination, of barriers to entry. where previously only a tiny minority of people pursued creative endeavors (beyond amateur or hobbyist status) and the vast majority of folks were basically only consumers, now anyone and everyone can be a creator (as well as a consumer). this is really really really huge. for it removes a lot, maybe even most, of the *struggle* from creative endeavors. when i was in film school, every student labored every day with the knowledge that only a minuscule number of us could ever be filmmakers. simply too costly and difficult to make and distribute films. today, my six year old son does it in his spare time (away from kindergarten). but back at film school, there was a natural winnowing process — only those with vision or passion or obsession even bothered to try. many would proclaim with pride “if i have to wait tables for twenty years, so be it. i am an artist and the path is one of struggle.” that simply ain’t true anymore. i can see a huge positive effect from this — perhaps a movement which will later be known as “post-cynicism.” in days of old, the majority of artists and creative types had to struggle so much they ended up quite cynical. call it the “van gogh syndrome”. if the singular genius of 19th century painting could not make a living by his art (he never sold one painting while he lived) then the world was nuts. cruel and dumb and indifferent. and can’t be redeemed. just observed and commented on by those with the emotional resilience to struggle on that path. but maybe that cynicism will go away now. it simply does not need to exist. anyone can be creative and everyone can get access to the world. “post-cynicism”. but there’s a sad and dark side here too. that acceptance of struggle, that yearning and determination — that passion — that an artist had to have burning in their belly and soul… well, guess that’s going away too. maybe we’re now living in the era of “post-passion”. and thats tragic. at least for people like me who think that great art and great literature — and yes, great product — needs to be baked in the intense heat of passion. ah well. maybe one is the price of the other. and after all, forget philosophy — the elimination of barriers to creativity is a great thing

      1. scott crawford

        Not post-passion. Maybe something akin to post-elitism? In any event, keep those seat belts fastened, and please keep arms and legs inside the car. Operator is not responsible for lost articles or loss of sense of humor.

  18. vruz

    I think the label ‘pseudo-modernism’ may be irrelevant or not, unfortunate or not; it depends basically on adoption by the public, the artists and academia. I think Dr Kirby leans towards the pseudomodern pessimists, whilst Fred Wilson leans towards the optimists. Amanda Chapel, as we have discussed on Twitter, seems to be leaning on the pessimistic side as well. After a couple of days of reading and re-reading this essay (which I believe is an important one) my position continues to be an ambiguous one. When I look at society at large, the so-called “web 2.0” still has not reached a significant portion of the world’s population. Students who live in agricultural economies (those of the “First Wave” according to Toffler) still don’t cut and paste texts from Wikipedia, they still don’t have a classroom blog, and only casually may have experienced what a computer does. I have been critical of Tim O’Reilly’s conception of Web 2.0 since before he crafted it. In my book, he’s in the extreme optimistic camp. He may or may not be politically inclined towards the neocon economy, I don’t know him personally, but there are striking parallels with the trickle-down effect: the misconception that it will automagically make a change for good, in web 2.0’s case a rosy-pink future where everyone touched by web 2.0 will be able to have their voices heard, will be part of a bigger revolution, will be empowered and will solve everyday problems by the grace of Web 2.0. That’s extreme optimism, or more accurately defined as “selling the Brooklyn Bridge”. As you can tell, Fred, we have a different point of view about O’Reilly’s Web 2.0. My concerns are ethical. My questions are not so much about how to define waves, eras, or schools of western thought. My questions – as a technologist – are more about this powerful tool we have at hand, kind of an Oppenheimer dilemma. What do we give to the rest of the world who haven’t had a chance to develop yet? More porn, more Microsoft Windows, and then the logical next step… the McDonald’s of Web 2.0? Or, instead of simply evangelising the participation economy because there’s a feelgood participatism thing to it, are we capable of embracing an ethos of MAKING GOOD? Google’s “DO NO EVIL” was a catalyst for Google. But simply not doing evil doesn’t mean one is proactively doing any good. Can we harness technology to make media, education, culture and society less vacuous? Are we able to promote ethical standards? I’d love to hear more from people who believe it is possible.

    1. gregorylent

      nice comments, and thanks for the impetus for this discussion …. what indian villages have is a whole lot of gossip, a kind of geographical omniscience, everyone knows everything that is happening everywhere, and everybody is involved in everything, pretty much …. cannot get more 2.0 than that! one could say that what we are doing with tech is what humans have already always been doing, for better or worse … the means are developing, the scale is expanding, but the village well is still where it is at, as far as real life goes. friendfeed is only a really rough approximation

      1. vruz

        I take it you’re leaning on the pessimistic side and that no matter what we do, the lowest common denominator is what human nature was and will always be? and therefore… because it doesn’t matter what we do… and human nature is and will always be what it is… we can do anything, and it won’t make a difference? :-)

        1. gregorylent

          no, i am amazingly, irrationally (such a limiting pov, rationality) optimistic … i think the whole flow is leading to a wider/vaster/deeper unfoldment of human awareness/potential/possibility ….. and that when it is “found”, one realizes it was always there/here/around. life is an unfoldment, consciousness can already do all that tech is trying to manifest externally, realizing one’s connection to all is the whole story of being born, and that when one does, there is no such thing as lowest common denominator. but just try to get funding for such a view!! lol

  19. Emil Sotirov

    I like the term “pseudo-modernism”… It describes well something familiar to me and some people around me (like my wife). We identify with things “modern” – but not in the traditional way of “believing” in them. It is more like a rather plain (emotionally dull) pleasure with a nagging feel of “fake”-ness. And the unbearable lightness of “silent autism” – very real too (not in a clinical sense, fortunately). Sometimes we think it is due to our truly hyphenated cultural DNA – francophile-russian-speaking-slavic-east-european-bulgarian-american-patriots – which makes us understanding-a-lot-still-not-truly-capable-of-articulating-it-preferring-to-stay-silent-most-of-the-time. And “the trance – the state of being swallowed up by your activity” – a salvation… wasn’t that the true prescription of Dr. Deconstruction – practice as the only thing left… no reflection, not much talking “about”… Anyway… here is something I wrote way back in 1992: “…a discourse produced as existential need, not as instrument. In a recent interview, Venturi says that he found himself in ‘Complexity and Contradiction.’ There are no clearly articulated intentions, plans, strategies, purpose, conclusions. The discourse is not produced as ‘useful’. This does not mean that it could not eventually turn out to be very useful…. This poly-logical rhetoric is not an evolution of the author’s ideas, nor is it a revolution against the ideas of other authors. It is a co-evolution and ongoing mutual displacement between a personal stand and cultural context. There is no author in the traditional sense (author/audience). The thinker/speaker/writer/designer is a mediator between cultural realities… – facilitating tendencies for self-organization which are always pre-existing in any context… There is no text, but always, and only, a con-text… …This… discourse necessarily produces confusion – it con-fuses experience and knowledge (semiotic and symbolic). It does not ‘translate’ experience into knowledge… …This rhetoric is suspicious. Because it reveals a fundamental undecidability.” Full post here: http://sotirov.com/2004/08/… And then came the Internet… and then came the Web… and we found our home… 🙂

  20. vruz

    In my case (moreover bearing a first name like mine!), coming from a long Latin tradition but also of a very mixed heritage, I think I understand what you mean by that hyphenated kind of silence. I wouldn’t voice my philosophical rants out loud during dinner with my Spanish Catholic grandmother, or my German Protestant cousins, or my right-wing-leaning Italian relatives. But Freudian motives aside, I believe we should be capable of reusing this rich heritage in such a way that allows us not only to candidly understand their points of view, but also to extract valuable teachings and tools to understand and act upon the complex world we have before us. Looks like your paper touched several of the points Kirby attacked. I’ll read your article with interest later tonight. Thanks for sharing.

  21. scott crawford

    At risk of opening a-whole-nother can of worms, how does Objectivism play here? Certainly much of what we’re wrestling with here is the role and rights of the individual, and the falling away of more classical gatekeepers. Seems to be relevant. Or mayhaps I romanticize too much.

    1. vruz

      I have mentioned open source and free software elsewhere on this thread. One could argue web 2.0 couldn’t possibly exist without Linux, Apache, MySQL, Perl, PHP, Python, Ruby. All expressions of open source and free software philosophy have deep roots in the rights of the individual to read, learn, and understand, and with them the rights to create and share, to help others and help yourself. Then came Google. Then came Lawrence Lessig, and Creative Commons, and then Flickr and YouTube and others. And we got great technological power, but I see no sense of responsibility. I see corporations creating matrices to accommodate the products of culture, the communications and even the relationships of individuals. Then individuals accepting the status quo as inevitable, happy to be given a small corner of their data centres. I think the public can benefit from these great technologies, but there’s a lingering danger of turning open source against itself: as more and more conform to the easiest path (let someone else build data centres and software for you), history could be repeating again, with only a few being able to read, write, and understand (and therefore only a few able to teach, collaborate, share). I don’t claim I have the answer, but the answer is – as always – probably a healthy point in the middle.

      1. gregorylent

        if i understand the essence of your posts, you are wishing to improve human character, and are wondering if technology can do that?

    2. mastermark

      In my opinion, it plays as much or as little as anything else. The software in discussion here is social in nature — it enables social behaviour. In a recent conversation with somebody at the Enterprise 2.0 conference, I pointed out that this is a hard problem — many orders of magnitude harder than, say, calculating an insurance policy. At its most extreme, that means dealing with everything from Palestine vs. Israel to Romeo and Juliet — in other words, the entire spectrum of human experience. This is what I think gregorylent means by “when it is ‘found’, one realizes it was always there/here/around”. People, being people, try to interpret new and unfamiliar experiences through the lens of their own experience — after all, what else can one do? Kirby sees “pseudomodernism vs. everything that came before”. Amanda Chapel (about as post-modern a construct as you could possibly hope for) sees everything as a question of propaganda and markets. Ayn Rand would no doubt see the relevance of Objectivism lurking behind every proud blog. Shrug (and I don’t know about you, but I’m hardly Atlas). It’s all good.

  22. Chris Dodge

    Fred, in reading this, I think you might enjoy Henry Jenkins’s work on “Participatory Culture”. He’s a Media Studies professor at MIT. Check him out if you get a chance.

  23. Amanda Chapel

    Just look at this discussion. Perfect example. Good intention aside… what’s the outcome? Look: No rules. No discipline. No authoritative anything. No conclusions. No learnings. NO VALUE! Hell, to you community advocates, did it serve to bring us together or separate us? IT SEPARATES US! Ya know… during a tsunami there’s always a group of people that will walk down and follow the tide out in awe of Mother Nature. Fred, Mark… I will see you when the tide comes in.

    1. mastermark

      Interesting logic. You think learning and value can only be products of rules, discipline and authority? And we’re engaging in this discussion here, so that… separates us. I’m not in awe of the flood, Amanda, as you well know. But I don’t think it’s particularly worrying, either.

    2. gregorylent

      “Look: No rules. No discipline. No authoritative anything. No conclusions. No learnings. NO VALUE.” Conclusion: we need a dictator in order to have value?

    3. Mark 'Rizzn' Hopkins

      Amanda, perhaps if you read the responses, you might find things you agreed with or found interesting. I might be wrong here, since I don’t have a camera trained on you, but from your responses you don’t really appear to be responding to anything in particular, except restating your opinion with various barbs meant to insult not just the premises of folks’ positions in writing, but their positions in life. The only thing intending to cause separation here is you. Unfortunately (for you, I guess), you’re failing, as every bit of commentary and back-and-forth serves to create more loyalty as well as attention and community. There’s a conversation happening, and even you are interested in seeing how it turns out. You keep coming back and contributing. That’s the definition of bringing people together. You’re being brought back to this community in an attempt to engage the rest of us (albeit in a somewhat hostile manner).

  24. fredwilson

    as always, the comments are about 100x better than the post. including you Amanda, whoever you are

    1. tweetip

      we’d love to see amanda in a video comment! 🙂

  25. Lloyd Fassett

    I think it’s remarkable that business and art theory would be in the same sentence. That says a lot more about what’s being said, than what’s being said.

  26. Paul

    Pseudo-modernism (if you want to call it that) seems like the long tail of ‘value’. Something that has tended to be hidden from us because established elites have had a monopoly on the means of creating cultural goods. Give those means over to the people and you get an abundance of what academics might term ‘vacuous’ cultural goods, but people love ’em. Try telling the 14-year-old American Idol fan that her favorite show is void of meaning and worth. I agree with an earlier poster’s comment that this has always been there. It’s just been hidden.

    1. vruz

      The ‘means of production to the people’ thing rings close to the Marxist reading wmfischer commented about earlier, in the beginning of this thread. Part of the discussion is precisely about the vacuity of the content. Teenagers have been producing poetry and other art of great intensity but little direction for ages, the whole purpose of it being self-expression. No new ideas, even though it feels like they’re new, because everything is new when you’re 14. They just have to get it out. No matter how. No matter what. I was there, and almost every teenager will be in a similar situation. It has happened to every generation since we were granted the right to read and write. New ideas and substantial works aren’t normally developed at an early age, judging from statistics. Of course there are a few exceptions that confirm the rule. I’m a big fan of Isidore Ducasse (Lautréamont) and I’m a friend of Christian Neukirchen, the guy who invented tumblelogs when he was 16, and then there’s Karp who started Tumblr at age 17, or Zuckerberg starting Facebook as a business at an early age. But not everyone can afford to be a little genius; the vast majority of us have to work our asses off to gain a depth of understanding and discernment that is uncommon in teenagers. The question is… is this all we’re gonna be? Is this all we’re going to produce? Will this be the most representative body of work of humanity in the coming decades? Don’t we have an ethical obligation to do better and encourage a true development of the real potential? Or are we to be like lazy, lousy parents letting them have their way so they get easy instant gratification and don’t annoy us much? Can we do better?

      1. Paul

        The ‘means of production’ reference is simply an economic one; you can draw parallels to Marxism but I don’t think they are wholly warranted. We’re not talking about a redistribution of wealth or labor, just a realization of the extent and intensity of exactly what you are talking about: the powerful force that is self-expression. Is it of value though? Can we do better? Can we raise the ‘average’ level of ‘skill’ to combat what you say is the vacuity of the current content? Well, who is the judge? I think Facebook is a poorly designed, poorly run, poorly thought-out proposition as a business and its CEO a deer caught in headlights; you talk about him as a boy genius. And can there be any other content more vacuous than the vast majority of what is on FB? You’re making a value judgment about the value of self-expression. So can I. But we’re not going to necessarily agree. And I know it’s a catch phrase these days, but that is exactly where this long tail of value is coming from. It’s not about the value you find in all of the different and diverse ‘cultural goods’ that get produced, it’s that they got produced at all in the first place, because SOMEONE thought they were valuable enough to devote a bit of time and effort to. I don’t think they expect us/you to fully understand. And nor do they care how vacuous we/you think their work is.

        1. vruz

          I don’t think they care either, but obviously you and I do care; we wouldn’t be dedicating time to this discussion otherwise. Some people may choose to help raise the average level of skill, and before they do, this is the kind of discussion they have :-) I can see what you mean about the long tail of value. We know that certain masterpieces of the body of work of humanity that have stood the test of time will remain in existence, if only because of their physical persistence. We have libraries, museums, specialists and researchers preserving and cataloguing the works of past ages. It’s not exactly the same case for digital works. Is it safe enough to let corporations do that job? Is it safe enough to let unskilled people vote on the premise of a long tail economy? Skills to interpret, select, evaluate, critical thinking, ethics… not only skills to produce. Without these skills, the body of digital works of the coming generation could be at risk of being irremediably lost. Not our cranky post-modern works, which we may not even care about anymore… but the digital works that disappear from existence without human intervention or the long-tail vote of skilled consumers. Once the pseudo-modern body of work is out there, the skillset of every involved co-author and co-consumer matters. Can you see why it may be necessary to help raise the average level of skill?

          1. Paul

            Maybe it’s just a learning curve? We are only a few years into this grand ‘experiment’, and I think we are bound to become better at interpreting, selecting, evaluating etc. But only in so far as what interests and intrigues us, and less so about what we think is best for a ‘collective’ cultural movement. The upshot of the long tail is the deep fragmentation of demand, and in this case cultural value. History tends to get written by the winners. It’s the same for art, literature, philosophy: everything that has come before was born of an extremely elitist system of class and patronage (at least the vast majority of it). Give ordinary people the means to express themselves and you get an abundance of, well, everything. What you don’t get are classical masterpieces as, by and large, few people really cared about them anyway. Academics like Kirby hate this as their world is built on privilege. Kirby abhors the outcome and laments the process (the ‘trance’-like state in which we click keys to participate), but that is the very POINT of the whole thing. It’s participation. It’s taking into our own hands the way WE want to define our own cultural niche. It’s not letting someone in an ivory tower TELL us what is important. That’s the real liberation. That’s the real value. And if on ‘average’ you think that value is somehow less than ideal, it’s hard to say one way or another, as there is no real point of comparison. But regardless, I think it will, on ‘average’, improve as we get better at understanding what’s important (to us as individuals), and how to spot that when we see it.

          2. vruz

            I started exploring this notion of the individual becoming artist, curator, critic, patron and marchand, all in one, in another post, when I used the term ‘micropatron’, because of the technologically-enabled ability to provide patronage to artists by way of micropayments (potentially thousands of them supporting any given artist). It is possible that it is a learning curve indeed. But just as we are enabling participation, bundled in with new technology (something that was unthinkable just 15 years back), shouldn’t we also provide the means for this ‘education’ to be an integral part of the system, instead of merely creating an illusion of participation where the vast majority won’t have anything meaningful to contribute? Can you see there’s possibly a lot more value in that?

          3. Paul

            Yes and no. Yes, I agree that in certain endeavors it is advantageous to bundle in ‘education’: how you get to do something / express yourself ‘better’ (learning a craft rather than making a mockery of it). But where this is important, don’t communities do it anyway? Look at the open-source movement. I think you could probably add film-making 101 to YouTube as well. The democratization of cultural goods tends to bring with it the means (if you seek it out) to perfect their expression, to the best of your ability. Again, ‘if you seek it out’, which many people don’t, or don’t bother to, but that will change I think. And is part of the learning curve. No, because sometimes it simply is the participation that counts. Clay Shirky had a great talk about participation (http://www.shirky.com/herec…) and I think he hit the nail on the head. It’s better to do SOMETHING, ANYTHING, than nothing; than being a passive, talked-to observer of culture. As he said, even lolcats are self-expression. Valueless in the extreme to me, but I guarantee someone got a laugh.

      2. dsheise

        Vruz, how about thinking of the web as a global nervous system, an extension of our own when we’re connected? The first thing that seems to be drifting fast to the collective system is memory, and I won’t go deeper into that here. But then what is next? How about our ability to act, just like our nervous system does whenever any spot of our body sends a signal that something is wrong? In Clay Shirky’s book “Here Comes Everybody” he describes several cases of connected collective action. Modernism created the illusion of the author, separated from the collective. Now this illusion is fading and the collective takes over. The risk of totalitarianism is high, but just as great is the opportunity to produce Wikipedia-like works with what Shirky calls “our cognitive surplus”.

  27. Rob Long

    Well, about sixteen thousand years ago, when I was in college, my senior paper in English Literature was a discussion of the birth of modernism in fiction and painting. Somehow, I found a letter from Joseph Conrad (I say “somehow” because I wasn’t a terribly enterprising student) in which Conrad inveighs against “the modern.” He was a nineteenth century guy, Conrad, and what he liked was order and process and duty. But he was also a great writer and artist, and he noticed that a lot of what constituted order and process at the time was falling apart. His greatest novels are about the struggle for order and meaning in a world that is increasingly spinning out of control: Lord Jim, atoning for a moment of cowardice and irresponsibility; Kurtz, upstream in an African village, going violently nuts. There’s a reason why T. S. Eliot used a line from “Heart of Darkness” as the epigraph for that great modernist poem, “The Hollow Men” (“Mistah Kurtz, he dead”), as a nod to an old order that the modern world had swept away. Had killed. In the letter, Conrad uses a metaphor. Modern life, he says, is like that (then) new invention, the knitting machine. This thing that knits everything together, without order or discrimination or master plan. It’s like a web, he said. Everything happens at once. James Joyce, who represents in many ways exactly what Conrad feared about modern literature, when he finished that great modernist novel, Ulysses, was once asked to explain his writing. “I don’t write,” he said. “I weave.” So these aren’t really new conflicts. The modern architecture of Corbusier or Mies was a conscious reaction to the curvy, ornamental Victorian/Romantic architecture that came before it. And that was a reaction to the Cartesian order of the neo-classical buildings before that. We’re always sweeping the past away. The painters of the early 20th century, the modernist greats of Matisse and Picasso, were in many ways trying to paint things the way they looked to the modern eye. Picasso always said that his paintings were trying to show multiple views of a subject at once, the way people look when they’re in motion, on a train, in tears, dancing, on film. I was totally mystified for years about how to look at a cubist painting until I went to a lecture by David Hockney, and he said, isn’t it obvious? It’s movement and motion in an instant. It’s watching all the frames of a film at once. He’s painting what it feels like to be a modern person, who can speed along at 60 miles per hour, who can watch a movie. But one of the things Conrad feared was something we all think about when we think about the web: scale. For him, it was the unthinkably destructive weapon of mass destruction, the machine gun, which was for him the knitting machine of death. For the post-modernist architects of the 1960s and 1970s, it was the impersonal, faceless scale of a city with no ornament. Because if everything is made out of steel beams and reinforced concrete, you can go as high or as wide as you like. And we quickly went from Mies to glass boxes to the Houston skyline. To cities that no one ever wanted to walk in, because the scale was off; it was too high, too wide, too impersonal, too indiscriminate. For the past 18 years, I’ve made my living telling stories the old-fashioned, Conradian way. On television or in movies, the deal is: we tell the stories, you sit in the dark and watch. And I don’t think that’s dying or even really fading away.
People have been sitting around the campfire, more or less, since they invented campfires. And there’s always someone who’s supposed to tell the stories. And most people, every now and then, want to sit back and listen. I know I do. But not always. And not for everything. One of the amazing things about the web (and I’m an optimist, firmly) is how quickly people used it to sort out and consult with other people on exactly how they should spend their “watching” time. Is this movie good? Is this show worth TiVo’ing? And you can see that, I think, in what’s on television, in the fiercely competitive race for a smaller slice, but a more devoted slice, of the audience. How many of us know people who unabashedly love maybe five or six TV shows? Shows that are actually high quality? I’m guessing most of us. Sure, yeah, there’s a lot of crap. But there’s also a surprising amount of good stuff. More than when I started in this business in 1990, that’s for sure. And told in the traditional way: with a (mostly) linear, Conradian, old-school plot, where the audience sits back in the dark and lets it wash over them. And so now there’s an actual knitting machine, knitting us all up, called the web. It would terrify Conrad; and Joyce, for that matter; and it terrifies a lot of people right this minute, with its huge scale and immense reach and creepy ability to burrow into our private lives. I don’t know if it’s post or neo or pre or pseudo or anything, really. It’s New, that’s for sure, but it seems to respond to a deeply basic human need, deeper, even, than the need to be entertained or amused or diverted. It tells us we’re not alone. It’s our chief weapon in the war against loneliness. That’s what Facebook and Myspace and Twitter and all of those are, it seems to me, deep down. When people in Hollywood talk about the web, they talk about it like it’s just a supercharged, turbo version of a distribution channel. Another way to get you to sit in the dark and watch. And it is that, of course. But what makes it magnetic and mysterious, what drives a lot of smart people to sit around and think up new ways to connect and seek and find and communicate with other people, is its ability to do the very thing that drove Conrad nuts: to knit us together. And that’s a new form of something. Deeper than entertainment. The idea of being able to be connected to hundreds of thousands or even millions of people around the planet is staggering (on the web) and terrifying (in an epidemiological sense). This is a way too long comment. I know, I’m sorry. Let me put it this way: I’ve never met Fred in my life; I came to this blog post because I follow Dick Costolo on Twitter. I’ve never met Dick either, but I admire what he did with Feedburner and I think his Twitter posts are hilarious. And now I’ve spent almost 30 minutes contributing my pompous senior essay to this blog. As soon as I post this, I’m either going to watch some TV or I’m going to dig into Alan Furst’s new novel. I’ve been busily engaged in “participatism,” but that doesn’t eliminate my other options of sitting and letting someone else tell me a story. Call it whatever you want, but it’s really just a form of socializing; it’s a version of social interaction that resembles an older, epistolary way of “being” with people. Like everything else, it’s not really new. It’s old. But with scale.

    1. mastermark

      Oh, thanks. What a great post. I’m gonna come find *you* on Twitter, etc. ;) @vruz: you perplex me, a bit, I must confess. You say: “part of the discussion is precisely about the vacuity of the content” and “will this be the most representative body of work of humanity in the coming decades?” What body of work are you referring to? The Big Brothers and Britney Spears and Jenna Jamesons of the world (I actually like Jenna, FWIW. She’s funny)? Rob Long just said: “Sure, yeah, there’s a lot of crap. But there’s also a surprising amount of good stuff. More than when I started in this business in 1990, that’s for sure.” Yes. *Exactly*. There’s a lot of good stuff, too. More than there used to be. Television is easiest to analyse, I find: things like The Wire and The Sopranos are surely worthy of posterity. But if you want to focus on the Net, my question remains the same: why the assumption that the output is only (or even mainly) crap? There’s lots of great stuff too: like, oh I don’t know, say, Paul Ford’s Ftrain (http://www.ftrain.com/). Or this thread, here. ;) It’s this dark and, frankly, cynical assumption that only the dreck is worth talking about that led me to infer that *you* were a pessimist, mate. The stuff we’re talking about has changed the scale, and the intensity, but not the essential *being* of who we are. Why would anybody assume that? We have produced much that is awful, in the past, and much that is wonderful; why would anybody assume that new technologies could ever change that essential truth? You keep asking if we could do better, a worthy goal, no doubt. But implicit in your discourse is an apparent (and seemingly deep) dissatisfaction with where we are now, and I don’t really grok that. Consider Myanmar. What has been happening there would have been *much* worse in an earlier age; there can be no doubt about it. I was just reading something recently about the Ottoman Empire, and how the Emperor was able to maintain control over such a huge Reich for such a long time, in part, by playing the far-flung warlords off against each other: he was able to do that because he *controlled the communications technologies*. IOW, the Ottoman Empire, in its original form, is impossible today. So, yes, I suppose you were right, earlier (I realise, slowly, with some surprise): the way I see things, the glass *is* half-full.

      1. vruz

        because 90% crap is just too much crap. glass half full, yes, dissatisfaction guaranteed, always :-) can we do better?

        1. dsheise

          I haven’t seen anybody here mentioning any current movies that could represent pseudo-modernism. What about Speed Racer? Isn’t it arch-pseudo-modern? Forget the making of it (there is no participation in making the movie itself); I mean the way it’s presented.

          1. vruz

            um… looks arch-postmodern to me. How is Speed Racer pseudo-modern?

          2. dsheise

            From Kirby’s essay: “Cinema in the pseudo-modern age looks more and more like a computer game…” The flatness of the movie, the rush of infosensations: that’s how I see it as pseudo-modern, which by the way I would change to HYPERMODERN.

          3. vruz

            hypermodern sounds good to me, as in hyper-kinetic. it certainly conveys the sense of an indiscriminate amplification of sensations.

        2. mastermark

          Well, that would be nice (if we could do better), and I certainly wouldn’t complain, but… Look, I just can’t shake the feeling that my argument is not getting through to you, and I can only assume that’s because I’m putting it badly. Let me try something else. Elsewhere (a bit further up in this thread), you say: “Without these skills, the body of digital works of the coming generation could be at risk of being irremediably lost.” Are you assuming that the “body of work” that we have now, of the works of cultural value known to us, is a collection of the best of all possible works created throughout history? Do you assume that, throughout history, no incredibly brilliant works of cultural value have been lost? Do you assume that a process has existed, up until recently, that has ensured that such works were always recognised and valued and preserved? If so, I disagree. I think it would be silly to assume anything other than the following: throughout the history of our species, it has been the *norm* (not the exception) for brilliant works of genius to vanish without a trace. Unnoticed and unremarked. Uncountable works of cultural significance have *always* been created and then lost to us; those that were not lost are the exceptions. Those works of cultural value (art, primarily, but also things like, say, the Code of Hammurabi) are the beneficiaries of a healthy portion of luck. Frankly, that’s one of the things that has us optimists all fired up: consider the possibility that these new technologies *increase the chances of luck for all cultural works*. A rising tide lifts all boats, etc., yada yada.

          1. vruz

            I do understand your optimistic point of view. It’s very well known that uncountable works have been lost. One of the saddest is the story of the Library of Alexandria; I’m sure you know that one well. Uncountable works lost, and we only re-learnt all of it a thousand years later (or possibly even later, in some fields like mathematics and philosophy). Digitally-based works are very unlike books, or buildings, or paintings. Digitally-based works aren’t automatically persistent in the physical world. You need electric current, perfectly functional machines, tech support and a certain environmental safety. None of that happens automatically. It’s our duty to make it happen. And all this stuff, you can pretty much tell, has me all fired up too, not just because of this current opportunity, but because it is the craft I’ve dedicated most of my life to so far, and will likely be what I continue doing until the day that I die, health allowing. I’m generally optimistic, but that’s because I prepare for failure, something engineers, architects and financial risk management professionals know well. When you’re designing a bridge, you make provisions so that the bridge tolerates more load than will foreseeably be necessary; you don’t leave it to chance, because you are aware the consequences can be disastrous. It’s a matter of ethics, responsibility and vision, not optimism or pessimism. And here I’m saying: the consequences may not be disastrous, it all may be for good, the glass may be half full, it all may even be pinky rosy, as some paint it. But the risk of loss is just too high for us to avoid giving it a good thought. And that’s what we’re doing.

          2. mastermark

            Ah, well I’m a system architect, so *that’s* an argument I can wholeheartedly agree with. But the problem of archiving this stuff is an entirely different question from: do these new means of production encourage the production of more crap (which is basically what I understood Kirby to be arguing)? To which I have been arguing, “no, these new means of production don’t change the ratio of crap to good stuff at all, or very little: they just make more of everything (crap and good stuff equally included) visible”.

          3. vruz

            no, it’s not a different argument, it’s a different facet of the same problem. the physical world has built-in entropy; the universe decides which physical works of art rot and decay into oblivion. digital works on BitTorrent disappear from existence when someone pulls the plug, or when not enough people on BitTorrent are interested right now in a certain work. how can we know which works will be valuable in 50 or 100 years? how about Kafka’s body of work, which wasn’t published until after he died? how about Lautréamont’s Maldoror (a case I’m very familiar with), which was published in 1870, censored in France, published in Belgium and only re-discovered by Breton and the surrealists in the 20th century? how about the discoveries and inventions of Paul Otlet? No Mark, it’s not a simple matter of statistical interest and long tail economics; it’s much more complex than that, and you’re only now starting to grasp what we’re really talking about.

          4. mastermark

            No, sorry, Horacio, that doesn’t convince me. I understand (and admire) your passion, but I still think there are two different topics being conflated here. 1: as a species, we have a (sacred?) duty to try and preserve as much of our cultural output as possible. Indiscriminately. For all the reasons that you’ve put forth, here, and elsewhere in this thread. We cannot allow that process of preservation to be coupled to the market, or economic forces of any kind, or even to any sort of “enlightened” criticism, as that criticism will be bound, by human nature, too closely to its own age, and therefore biased. 2: as a species, we are entering an age of (perhaps permanent) vacuity, stupidity and reduced intellectual achievement, enabled by technologies that discourage deep thought and the slow development of craft. I agree 150% with the first theme, and disagree just about as strongly with the second. (Having said that, the architect / geek in me can’t help remarking that 1 is a *hard* problem, mate…)

          5. vruz

            excerpts from: Cranbrook Commencement Address by Julie Läsky (http://www.designobserver.c…)
            The question is not whether people should frankly expose pain, which is a private matter, but whether we as a society should be more mindful of what passes for sensation. For it is mindfulness, the steady, deliberate processing of experience, that we’ve been missing when we talk about our hunger for tactility, authenticity, or individuality in the material world. It is mindfulness, the experience of thinking through feeling, that makes craft, craft. As a writer, I can tell you that, although the hand is identified with craft, materiality can be thought as well as felt: you can hear the click of a well-constructed sentence being assembled in your head; you can feel the stretch of a good mental workout. You don’t need to get your hands dirty to be a craftsperson; you need to get your minds dirty. And where I will point an accusatory finger at technology is in providing shortcuts to the slow deliberative process of acquiring a skill, which is guaranteed by working the hand. If you’re not careful, technology, with its immediacy and high resolution, swaps the illusion of mastery or wisdom for the real, hard-won thing. As the sociologist Richard Sennett recently wrote, “The slowness of craft time serves as a source of satisfaction: practice beds in, making the skill one’s own. Slow craft time also enables the work of reflection and imagination — which the push for quick results cannot.” For Sennett, the value of practicing a craft is not just improved technique but also an ability to make imaginative connections to other parts of culture. In a recent interview with I.D., he discussed a widely held belief that 10,000 hours of practice are required to build expertise in any discipline. For the first 5,000 or 6,000 hours, the student simply learns to ingrain the physical gestures associated with his or her craft, whether glass-blowing or cello playing. “But,” Sennett cautions, “if your habit is fixed, you never get better.” So, somewhere between the 6,000th and 7,000th hour, the student begins to apply lateral thinking to develop new habits that are influenced by other areas of culture and work to enrich his or her own. There are social and ethical, as well as aesthetic, dimensions to this process, in Sennett’s view. And he makes a very convincing case for it. Another curious sign of craft’s emergence is more arcane, but bear with me. In the 1990s, there was a three-word mantra that ruled many of our lives. It was Nike’s slogan, “Just do it.” This message told us that our lazy, cowardly selves (and nothing else) stood between us and self-fulfillment, and if we just had the gumption to try harder, we could be more like Michael Jordan, even though he was such a remarkable athlete he didn’t even really qualify as human. In this decade, we have a three-word mantra, too. It’s what Tim Gunn of “Project Runway” tells fashion designers who have less than an hour to assemble a dress out of bread dough or mismatched pieces of upholstery fabric. He says: “Make it work.” At a time when all hell is breaking loose geopolitically, we acknowledge that obstructive forces may be external to us. We recognize and even celebrate challenges before we hurl ourselves at them. This being America, however, we’re still expected to draw on our inner resources to overcome those challenges, and there are no excuses.
Make it work. Nike’s “Just do it” may have been good for our sense of empowerment, but it’s been terrible for the earth. We’re suffering from a glut of products, many tossed onto the market with a breezy heedlessness that probably has been mistaken for courage. You can imagine how a manufacturer, even after hiring consultants, doing research, and convening focus groups that challenged an idea for a new product, said, “Aw, fuck it.” Which is another way of saying “Just do it.” How else can we explain a Procter & Gamble room deodorizer called ScentStories that “plays” like a compact disc, rotating in a heating device that releases a different smell every 15 minutes. “Just do it” is mindless. In fact, it militates against thought. It suggests that if Hamlet played extreme sports or ran a marathon or two, he’d have been a lot happier. The verb is truly active: “Do.” And the “Just” preceding it is a split-second sigh of impatience, a paper-thin slice of temporality before you’re supposed to get off your ass and onto that snowboard. “Make it work,” on the other hand, is about deliberation; “make it work” recognizes the unlikelihood of perfection and the strong possibility of flawed performance. It’s based firmly in time; it represents limits. Most telling of all, it comes out of an age when remarkable athletes have been exposed as steroid users. This directive connects to a process, and the verb is constructive: “Make.” The items it alludes to are idiosyncratic and frequently lumpy articles of clothing, not perfectly machined Nike models or shoes. Sure, “Project Runway” is all about craft, but Tim Gunn could have used any number of catchphrases. This one really captures the ethos of craft, in both its material and mindful dimensions. You’ve probably noticed that I’ve circled back to Reality TV, or at least to reality. Randy Pausch, in his last lecture, urged everyone to “Decide if you’re a Tigger or an Eeyore,” an optimist or pessimist. Sorry, gang, I know I promised you encouragement but I’m with the grumpy donkey. So ending on a more conventional note, here are my own directives: Don’t worry about categories or definitions like craft, design, or art. Philosophers and magazine editors are happy to hammer them out for you, and you have better things to do anyway. One is to develop as many areas of expertise as you can without watering down what you’re good at…..

          6. fredwilson

            Wow. Some great stuff in there. Reblog coming at fredwilson.vc. Fred

          7. gregorylent

            other cultures would say, go with your strengths, forget developing many areas. other people have those covered

          8. vruz

            it rings true for people coming from a culture with a deeply rooted division of labour. but that’s not the culture I was born into; I have the unintentional benefit of being able to understand both.

          9. mastermark

            There’s an interesting blurb about a recent study that echoes some of these thoughts about cultural differences on SciAm (http://www.sciam.com/articl…), “Fishermen Think Holistically”. There’s a link to the original study there as well. The question going through my mind, reading this excellent piece, was something like: how can we know if this is a truth? And not generational grumpiness, like the Luddites? We are so close to the problem, so lacking in objectivity… The coming of the industrial revolution produced lots of similar protests about the death of craft, but as I gaze at my iPhone and my Powerbook, there is no doubt in my mind that industrial production can produce works of great aesthetic value. Again, I think about Myanmar. The ’Net (and social software running on it) is already producing great triumphs…

          10. vruz

            come on Mark, I can seriously go and consider any theory published in Scientific American, and it’s okay if you believe that the (optimistic) 8% market share of Apple Inc.’s products in the American market is any sign of real advancement of mankind; I will let you have that if that pleases you. but seriously, introducing Myanmar into the discussion, where the death toll is about 128,000 according to the Red Cross, that’s not an area where I want to introduce the relative banality of western aesthetics. seriously.

          11. mastermark

            But I am being serious, mate. The situation there would be even more dire if the regime had been able to limit the information flow more than they did. You’re right that a contrast (unintended, but I see it now that you point at it) between the iPhone and the situation there is inappropriate, however. Sorry about that. Seriously. Didn’t mean to make that juxtaposition; typing faster than I’m thinking. Having said that (typing faster than I’m thinking), isn’t that what the essay you quoted is bemoaning? And isn’t it what we’re discussing here, in a broad sense? So, clearly, I’m guilty of exactly that which Läsky and Weeks are concerned about. To which I say: shrug. So what? The result of my doing so is a deepening of the conversation, in all of its messy, flawed humanity, isn’t it? How is that bad?

          12. vruz

            I think you’re giving yourself a lot of credit here :-) For a good part of the conversation you’ve been asking questions and I’ve been responding, and you’ve been (slowly but certainly) coming to a comprehension of the real problems. It’s not just about Kirby and his academic bigotry, but about problems of a much more serious depth and consequence. But I give you credit for persistence and genuine curiosity, which is sadly uncommon these days.

          13. mastermark

            Lol. Fair enough. And as for the comprehension bit, I thought it was you who’s slowly coming round, but again, fair enough. ;) I understand the deeper social and economic (and moral) problems you are alluding to. But my debate club training (lol) keeps compelling me to try and pull you back to (just) talking about Kirby’s thesis; that’s what Fred’s post, and this thread, is ostensibly about, after all. And I stand by my assertion: THIS THREAD is a wonderful refutation of Kirby’s thesis. All by itself. You and I have connected, as have others. You and I could now go offline, and go into deep geek mode, talking about how to leverage Amazon S3 to provide a low-cost way of archiving the entire universe (and why don’t we do that? ;)), and that possibility could never have existed without this thing, the very thing that Kirby is bemoaning.
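            (As a rough, purely illustrative sketch of the "archive it to S3" aside, not anything the commenters actually built or specified: uploading a single digital work could look roughly like the following, assuming the boto3 library, already-configured AWS credentials, and a hypothetical bucket name.)

              # Illustrative sketch only: upload one file to S3 for long-term archiving.
              # Assumes boto3 is installed and AWS credentials are configured;
              # the bucket name below is hypothetical.
              import boto3

              s3 = boto3.client("s3")

              def archive_work(path, bucket="hypothetical-culture-archive"):
                  # Use the bare file name as the object key.
                  key = path.rsplit("/", 1)[-1]
                  with open(path, "rb") as f:
                      s3.put_object(
                          Bucket=bucket,
                          Key=key,
                          Body=f,
                          StorageClass="GLACIER",  # low-cost archival storage tier
                      )

              # archive_work("maldoror.txt")

            (The storage class is only the cost knob; the genuinely hard part of the "archive the universe" joke would be selection, checksumming and keeping the bits readable over decades, which is the thread's real argument.)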

    2. scott crawford

      Love it. Thanks.

      1. SexySEO

        “Post-modernism was the post-war (WWII) reaction to modernism” – heresy and licentiousness, LOL, nonsense. Post-modernism never existed, only in the minds of a very small group of “20th century great philosophers” LOL. There is NO such thing as Post-modernism – it’s simulacra to cover up the absence of any appropriate “ism” at the end of the 20th century. And now at the beginning of the 21st century we are not even trying to cover up our modern (contemporary) life with any “isms”. :D BTW, why is “ism” so relevant for you, Fred? Ohhhhhhhhh, I didn’t mean to reply to your comment Scott, sorry! 🙂

        1. scott crawford

          s’okay. Happy accidents.

        2. fredwilson

          Who are you?

          1. SexySEO

            What an arrogant lad you are!!! 🙂 I thought that VCs should have better manners, even if I displayed my interest first. ;-) I’m just a silly girlie who has nothing but her boobs, playing around on the net! ROFL :-D Seriously: Internet avant-gardist, and the person who is involved in a Global Social Graphing project. PS: ah, yes! And the person who knows that the term “Postmodernism” and all of “Postmodern philosophy” is err… mmm… an entire fabrication 😀

    3. vruz

      thanks for sharing, Rob.

    4. gregorylent

      very well written, roblong. you note: “And that’s a new form of something. Deeper than entertainment. The idea of being able to be connected to hundreds of thousands or even millions of people around the planet is staggering (on the web) and terrifying (in an epidemiological sense).” this connection you refer to is simply the mass recognition of what previously only mystics pointed out: that there is only “one thing” and we are all a part of it, only one consciousness, all of our minds are based in that. they referred to understanding this as “knowing that by which all things are known”. it is not only a way of being with people, it is a way of being yourself, who you really are, and have always been. thanks for your writing, will look at your site

    5. vruz

      Here are some valid points made to think it is something new. http://www.washingtonpost.c… Excerpt follows: …The demise of orderly writing: signs everywhere. One recent report: young Americans don’t write well. In a survey, Internet language (abbreviated wds, 🙂 and txt msging) is seeping into academic writing. But above all, what really scares a lot of scholars: the impending death of the English sentence… Librarian of Congress James Billington, for one. “I see creeping inarticulateness,” he says, and the demise of the basic component of human communication: the sentence. This assault on the lowly (and mighty) sentence, he says, is symptomatic of a disease potentially fatal to civilization. If the sentence croaks, so will critical thought. The chronicling of history. Storytelling itself. The Internet revolution, Billington says, creates new possibilities for people to be in touch with others, but it could also lead to a gobbledygook language without sentences and punctuation and paragraphs, and with less understanding of the world and its meaning. “We are moving toward the language used by computer programmers and air traffic controllers,” he says. “Language as a method of instruction, not a portal into critical thinking.” Once kids stop being able to form coherent phrases, it does seem like a good time to reevaluate……

      1. Rob Long

        I’m sort of with him on this. He’s making some great points, and it really pushes all of my fuddy-duddy buttons, frankly, when I come across that awful text-speak. There does seem to be a general breaking down of good, sharp, solid sentence-making. But as soon as I say that, I suddenly feel like one of those old actors like Jim Backus, who always played the cranky, out-of-it dad in those ’50s movies, where the teenagers would bop around and use hepcat slang. “What’s this world coming to!” he’d say, rolling his eyes at the dancing and the shaking and the kid in the goatee saying, “Coolio, Daddio.” Or something. And also: regular, organized spelling, grammar and punctuation are really rather new concepts. They weren’t part of what constituted “intelligence” until way into the 19th century. Spelling and grammar, especially, were seen as regional habits, like pronunciation. The optimist in me thinks that language, anyway, is always evolving. Slang and neologisms are part of the fun. And all of those kids texting and blogging and emailing are at least writing; it’s not as if they’d otherwise take quill pen in hand and scratch out a beautifully composed letter. And as a professional writer, what I’m most surprised at is how excellent the writing is, mostly, on blogs and even on Twitter (!) from people who aren’t Writers but merely writers, amateurs, like Jane Austen and Chaucer. The web is really about abundance, and people like Billington can easily be persuaded that it’s all turning to gobbledygook if they only look at certain places, but if they choose to look elsewhere, they’ll see critical thinking and wonderful writing from all over, from all kinds of people, in equal abundance.

      2. mastermark

        Heh. I’m not convinced. Why do people always assume their immediate experience is something “new under the sun”? Thought experiment: you hand a copy of anything by Hemingway to Shakespeare. Do you assume that Will would see that as *progress*? As an *improvement* over the use of the language in his time? Lol. I doubt it. That thought experiment gets more amusing as you swap in different, modern writers. What would Will make of Ulysses (the Joyce version, natch)? Would he like, say, a Tolstoi translation better, and then be startled to learn it was a translation from Russian? Etc. The future always looks like a catastrophe to middle-aged men.

        1. vruz

          that’s a correct appreciation in my view. I don’t necessarily have the ‘cultural equipment’ to evaluate the works of the next generation. but precisely for that very reason, we can’t rely on the survival of digital works happening spontaneously, or through our current limited view of what comprises a great work of art, and our banal long-tail opinion of right here, right now. that’s for the next generation to decide. see how those two conceptions run contrary? preservation of works for posterity, and long-tail voting?

          1. mastermark

            Yes, I do see that contradiction, mate. And I agree with you completely. But as I just commented elsewhere, I don’t think it is what *Kirby* was talking about in the essay that got you (and thus Fred) started here.

          2. vruz

            It’s one of the reasons why I sent the essay to Fred in the first place. You know he’s into tech, right? 🙂

          3. mastermark

            Well, sure, I know what Fred does… Aha! So you want him to *fund* our idea of archiving the universe in the cloud. Got it. You’re right, Horacio. I’m the slow one. ;D

          4. vruz

            I’m not there yet. Besides, Google and AWS have a considerable advantage on that one. I have some ideas about global vacuity minimisation, though 🙂

          5. vruz

            I was replying to another of your posts elsewhere, but since we seem to have reached at least a mutual understanding of what we’re talking about, I propose closing this thread and taking the discussion elsewhere, on the premise of those two concepts we have extracted, without all the Kirby bigotry and all the associated distortion. What do you say?

          6. mastermark

            +1. Where?

          7. vruz

            I’ve just sent you a DM on Twitter.

    6. fredwilson

      What a fucking great comment. Another example of the comment as the blog post. Bravo. I’ve never met you rob, but I do know dick and he is hilarious, more so in person. How about this? Someday you and dick and me go out for a beer and talk about whatever comes to mind. I think we’d enjoy it. Fred

      1. vruz

        fucking irresponsible, grandiose comment, I say. again, architects, engineers and financial risk management professionals don’t throw dice to solve problems; they prepare for the worst possible scenario. the extreme optimist can always take consolation in excusing himself later because he didn’t know what he was doing. Dick may be hilarious, but this, to me, is serious as cancer.

      2. Rob Long

        Wow. Cool. Thanks, I’ll take you up on that. I sort of feel like blog comments are what you owe to the blog host, sort of like going to a great party with excellent food and powerful drinks and smart, funny guests: you’re obligated to do your part, to justify all of that shrimp and pricey champagne you’ve been hogging. What I mean to say is, Fred, thanks for throwing such a sparkling and electric party for all of us, day after day.

  28. vadadean

    “Bill Gates has won… I’ve got the post-modern blues…” http://www.imeem.com/people…

  29. Matt

    Thanks for the commentary and for turning me on to the essay. I too don’t like the term pseudo-modernism. Participatory modernism also feels too long-winded. Maybe we should just ditch ‘being modern’. But the essay rings true on many levels regarding our approach to, and relationships with, cultural products. Enlightening.

  30. Daniel Smith

    Splinterism – where everything you search for is in an increasingly far-flung set of places.
    Remixism – where internet technologies make it increasingly easy to fuse shards of a concept together into something new and very personal (but also quite shareable).

  31. terra210

    “Cut and paste”, appropriation, and similar forms are now foundational elements of what is experienced, and have been so since the early 90’s. They have been refined, but they do not define or describe the whole of the current environment. Walter Benjamin’s essay (we all know this, so I won’t quote it) was the first to define the shift of the author within a mechanized system. It seems the difference now is that communication is more broadly mediated than it was before. The access points into the machinic have been extended, through the use of smart phones and text messaging. The “writer’s” location is mapped, from the inside out (inside the machinic system, out to the human), first with GPS technologies, now with more complex mapping tools which are rapidly being developed. So you have a tension, between the location inside a machinic system and outside it, which has become more formidable as the technologies become more dense. They become more dense when they use more of our senses. The more senses engaged, the less we feel the gap, yet the more we feel the tension, between us and that which is machinic. I am using the word machinic for a reason: to mark a distinction of resolution of material, and organization of that which is material, within a system. We too are matter and organization, but at a much higher “resolution”. If now our connection is more mediated, through image/movement and sound, these components start to fill in for written language. This is why, to me, it seems the current>future state of communication within this system is assembling an extended form of pictogram or hieroglyph. We can see traces of them with our little avatar images posted here. Soon the text and image will both integrate and transform into a new, more compact, useful abstraction. The compacting of text, which is very succinctly balanced in Twitter (140 characters is just enough for some focus), is the first step towards becoming less attached to extended written forms of language. IM was only a notation tool. It will be our new language, but it will come on so fluidly that we won’t notice. Derrida was right, in Grammatology. I am playing here. Forgive me… But it is fun.

    1. mastermark

      Agreed. (To be read to that old 80’s track (can’t recall who it was) “The Future’s So Bright, I Gotta Wear Shades” as soundtrack) 😉

  32. lba

    Here, as a sixteen-year-old growing up immersed in ‘pseudo-modernism’, are my thoughts on the essay; often I have rebutted Kirby’s claims:CGI frequently [inadvertently] works to make the possible look artificial, as in much of Lord of the Rings or Gladiator. Battles involving thousands of individuals have really happened; pseudo-modern cinema makes them look as if they have only ever happened in cyberspace. And so cinema has given cultural ground not merely to the computer as a generator of its images, but to the computer game as the model of its relationship with the viewer.Erroneous: remember that this “[giving of] cultural ground” happens “inadvertently”. Pseudo-modern cinema does not make “them look as if they have only ever happened in cyberspace”, as this happens accidentally, a by-product of the production.The cultural products of pseudo-modernism are also exceptionally banal, as I’ve hinted andThe pseudo-modern era, at least so far, is a cultural desert.Remember YouTube, blogging, Jamendo, Flickr, free software; not at all “banal”!The world has narrowed intellectually, not broadened, in the last ten years.Overall, yes, but with an increasing intellectual trend which I see within my peers• Discussion of books, lyrics; interpretation of traditional culture (eg, TV shows); interpretation of ‘pseudo-modern’ culture.Pseudo-modernism is of course consumerist and conformistVery bleak view: here suggested that people, whilst assumptive of their necessary participation, are unaware of their own subliminal guidance by the creators of the pseudo-modern texts; what happened to individuality, creativity? Regressional: not necessary.Here, the typical emotional state, radically superseding the hyper-consciousness of irony, is the trance – the state of being swallowed up by your activity.It is still possible to be completely immersed, “swallowed up” by traditional culture: reading a book, creating its world inside your head; playing an instrument in a band, unaware of the audience.And indeed, here I am, commenting, thinking, being intellectually productive, creative, responding on the Internet in a typically pseudo-modern way, and (hopefully!) not being in the least “banal”.On most of the other points that the essay makes, I hear an echo of thoughts I’ve been having for a while now.

  33. Jesse!

    You know what I’m finding from all this? That people constantly think they are coming up with new ideas because the internet has accelerated interaction and interactivity. “Pseudo-modernism”? Puh-lease… it’s all still very much post-modernism; the kids just call it “meta” and think they’ve come up with something new. How come no one has mentioned Marshall McLuhan in any of this? The guy predicted interactive television and basically the internet. In other words, he’s said all this before, but never tried to call it anything but post-modernism. Then again, I was born before 1980! (oooooh!!!!)

  34. Marcus

    While that article, and yours, don’t directly talk much about the generations involved, I think it’s worth factoring that in, and I think it gives a perspective that the “lost” generation, Gen X, is complicit in the changes in deep ways. In fact, looking at the BoingBoing vs Violet Blue debacle, which is what brought me here, we see a fight between Gen Xers who happen to be well known, even if not well loved, as agents of change. We’re also seeing that the technology that enables all this change is really the boring part. We’re using it in ways that surprise us, and it’s a wild ride. But make no mistake, Gen X and Gen Y are working together in a complex and unexpected dance to overturn the horrors of the baby boomers and the horrors of the “greatest generation”. Now it just remains to be seen what our horrors are. Speaking of which, Gen Y are considered by some to be the likely and cyclical “loyal followers” of whatever new despotism takes shape. From Google employees to invading foot soldiers, that seems just about right (to take an Americentric viewpoint).

  35. Marcus

    @Jesse!: I echo your sentiment too. Everyone is special, just like me 😉 They can’t be what came before, they are unique and new and gonna change the world, can’t you tell, LOL. Yeah, it’s retarded in that sense, especially the original article, but pretty interesting as a batch of introspective observations, just not all that new. (We should however factor in the possibility that there are evolutionary adjustments to intellectual capability, and that those happen quite quickly, so I guess I’ll argue both sides, it’s fun. If that’s the case then Gen Y is really really hyper-intellectual and really really stupid analytically, but I’m biased toward Gen X, heh.)

  36. KC

    Hello. Umm, I would just like to point out, as I have briefly skimmed over this article, that the distinction between user-generated and user-necessary culture as being a move away from postmodernism is silly. It seems to me that this is one of the propagating products of the post-modern era, where such a rejection of ultimate truths leads to pockets of people with their own belief systems. Like, we can say those people on MySpace, Reddit and Facebook are all holding different value systems, and they allow those sites to exist, creating their image themselves. However, they all must agree on a common meaning of such a culture by accepting the legitimacy of its existence, so it must have meaning to them. Equally, we could say that such user-generated sources do have a demographic. Text messages, again, very much exist within late capitalism, via mobile phones. What I’m getting at is that this pseudo-modern distinction is really just addressing the social phenomena within a post-modern/late-capitalist age. I’d love a reply.

  37. Bryan

    Nicolas Bourriaud (curator and writer of “Relational Aesthetics”) is arguing for something he’s calling altermodern. Check out the Tate Britain website – http://www.tate.org.uk/brit

    1. fredwilson

      Oooh. I like that very much!!

  38. Cornelius Talmidge

    Participatism is too positive a term for this. He uses the prefix “pseudo-” (fake/apparent) to stress that the individual has a false sense of authorship/freedom/power/control etc. I think calling this state of human relations pseudo-modernism suggests that recent modern culture had true individual power and authorship in day-to-day relations, which is strange, because that’s untrue; we just didn’t have the false sense of it that we have today. This is really interesting, by the way. I can use this in my thesis. Thanks!

    1. fredwilson

      If you do, send it to me when you are done

  39. Gilles Deleuze

    Pseudo-modernism sounds a lot like post-modernism to me: pluralism, personalization of experience, non-linearity. These are all characteristics of post-modern thinking.

  40. C Dyer

    I really enjoyed this post and all the trailing comments; I feel much smarter after having read this. I do think that I might have some personal insight into this. Also, since (1) I am an artist and (2) I am just starting to understand post-modernism, I’m going to have to substitute words here, so that I can explain this from my perspective. I’m an art student right now, looking at my senior year for my BFA. It’s a small school with a traditionalist bent. The students, over the course of a decade, have begun to push the curriculum from Classical image-making towards modern and even post-modern image-making. The best thing about evolutions in thought is that when they start rolling, they pick up speed fast. I think I’m starting to see this hyper-modern trend gain traction already, when post-modernism was broached really only a few years ago. I would say that I sit squarely on the cusp between post-modern and hyper-modern. I’m a good seven to ten years older than some of my classmates; I was born around the time when Kirby estimates the schism begins; all my relatives in the same non-chronological generation as myself are in their 40s and 50s. I’ve taken in a strange mishmash of perspectives in my time. I’d like to think that this would prepare me for this participatory essay, but I’m afraid it hasn’t. I’m not sure if I can accurately explain the hyper-modern mind, but I can sort of give you a caricature. The first thing you need to understand is that there’s a greater sense of inter-connectedness. It’s not the viewer, isolated; it’s the viewer as part of a larger whole. The niche, the fan group, the message board, the mailing list. Group theory in all its glory. There’s a vacuity, really, in the hyper-modern, but that vacuum hides an innate understanding of the solutions to the problems post-modernists struggled with. They’ve been dealing with deconstructionism as long as they’ve been dealing with Nintendos and 24-hour news channels. “Jungian archetypes” will never appear in their lexicon, but the ideas are pretty much just understood. Most understand Freud’s ideas even if they can’t name the man. The sense of communal experience is very important to the hyper-modern discourse. The assimilation of pop into every facet of life. There’s an extra level of disbelief that must be overcome with the hyper-modern eye, as well. They are used to having to buy into mind-bending post-modern ideas (somehow); what is required is to convince them to bother to participate. Compare that last sentence to the idea of browsing the Internet: (search), browse, select, interact, (share). They are choosy. They are jaded from birth. They must be convinced to participate, and if they are properly rewarded, this one experience quickly becomes communal. They are culturally ravenous. They are obsessive. Hyper-modern artists are conspicuously conscious of the ideas of brand and selective purchase. They know that most people will not buy into an album if they can get just the one song they like, unless, of course, there is a brand identity issue at hand. They apply this idea to their art. They are more part of their society than post-modernists, but in that they are far more consumers than the generation before them. Relations between marketing, advertising, design and art, already so blurred, will probably completely melt.
I foresee more “celebrity artists” like Jeff Koons. Just a moment to drop back on the idea of how the communal aspect of this movement works: consider the arts (creation, curation, and patronage) occurring in an anarcho-syndicalist paradigm, with nearly total connectedness between small groups and federations alike. Between this and Group Theory, I’m hoping that you really get what this movement is capable of. I, personally, fear that it will be a younger, more attractive Koons with blander art. I think, when you folks bring up arguments about the visibility of work, the “good stuff vs. crap” argument, the argument of the viability of digital media, and the concept of craft, you have brought up what will be the end of the hyper-modern movement, which I predict will precipitate the anti-modern movement. The relationship between artist and consumer (exacerbated by economic crises) is called into question; the idea of craft and production (as opposed to efficiency and reproduction) is reintroduced. I think the images will find themselves more concrete and applicable to the world; I think representational art will find a resurgence, but also a continuation of artisanal craft. Returning to old materials, and creating new materials that imply permanence.

  41. aimotukiainen

    As long as the most popular pseudo-modern piece of production is “Charlie bit my finger again”, I’m happy to remain just plain modernist.