
Archive for the ‘traffic’ Category

First, we will bring ourselves to computers. The small- and large-scale convenience and efficiency of storing more and more parts of our lives online will increase the hold that formal ontologies have on us. They will be constructed by governments, by corporations, and by us in unequal measure, and there will be both implicit and explicit battles over how these ontologies are managed. The fight over how test scores should be used to measure student and teacher performance is nothing compared to what we will see once every aspect of our lives from health to artistic effort to personal relationships is formalized and quantified.

 

[…]

There is good news and bad news. The good news is that, because computers cannot and will not “understand” us the way we understand each other, they will not be able to take over the world and enslave us (at least not for a while). The bad news is that, because computers cannot come to us and meet us in our world, we must continue to adjust our world and bring ourselves to them. We will define and regiment our lives, including our social lives and our perceptions of our selves, in ways that are conducive to what a computer can “understand.” Their dumbness will become ours.

 

from: David Auerbach, N+1.  read it all.   

 

I love this piece.  Brilliant synthesis.  Hard to prove… just have to watch it all unfold.

Read Full Post »

I am so disappointed.

Mysticism returns to prime time TV with this inane crime-stopper series “LIE to ME*,” heralding the star (Tim Roth) and his team’s ability to read people’s faces and tell when they are lying, and about what. Crimes are just the medium for law enforcement to clean up with all that legal mumbo jumbo.

Forget the advanced science of the real-life CSI groups who offer empirical data as evidence supporting suspicion of involvement or not, as shown or implied in other TV dramas. Too many big words and too much emphasis on logic over folklore. That was wayyyyyy too tough to understand.

So, I guess the Vietnam war injury from a concussion grenade will not get mentioned in the villain’s arraignment. We’ll be able to tell if President Obama really is going to address the issues of the day and, most importantly, whether or not he is embarrassed to have a middle name of “Hussein” after all.

Working with this fantasy, think of where it could all lead: you are successful based on not being able to purse your lips or raise an eyebrow due to Botox.  No more need for matters as suspect as a ‘Twinkie defense.’  It was a facial tic that sealed the Olympian’s doom for using banned substances… Or, your movie is given the green light because you looked the producers in the eye and your nose didn’t flare at the same time…

If only we knew what to look for before the Columbine and Virginia Tech events… And all along those media mongrels were leading us down the path of science, contingency management and stem cell hope. But no more…

Enter the latest version of phrenology** and voodoo*** for prime consumption.

I am so disappointed.

* Not the absolute blues-grunt-rock of Jonny Lang’s live version of “Lie to Me”

** Phrenology: a defunct and debunked field of study, once considered a science, in which a person’s personality was first implied and then determined by experts “reading” the bumps and fissures in the subject’s skull.

*** Voodoo: a religion based on a mix of Roman Catholic teachings and West African beliefs that there are numerous deities subordinate to a greater god spirit (who does not traffic in the matters or events of mere humans). Prayers and incantations go to the lower gods, who show their work through symbolism in everything from tea leaves to smoke – only coincidentally related to the smoke from a sacred chimney announcing a new Pope.

Various Blog Coverage:

TV Addict

Chicago Trib

Televisionary


Read Full Post »

Could socialmode.com be the next breakout on the Interwebs?

Social Mode Crushes Traffic Expectations


Be part of the dream!  Comment!  Suggest! Link to us!  Put us on Digg!

hahahah!

It’s kinda nice to see some traffic, even though all the money goes to WordPress.com right now.

Oh well.

Read Full Post »

LinkedIn has been a long-time darling in blog discussions, VC conversations and strategery sessions.

Techcrunch brings us news of their latest shuffling of the exec deck.

It’s one of those media ventures that seems like such a great idea but really isn’t.  It has no growth potential left, and it’s a media product that grabs user bases that really aren’t worth that much to advertisers or big exec firms – for example… unemployed middle-aged, middle-management white guys.

Now, don’t get me wrong.  I like LinkedIn as a place to host my resume and occasionally link to people.  Unfortunately, I think that’s just about all anyone uses it for, excluding marketers and recruiters.

This isn’t so unusual.  Almost all other online job boards, professional networking and recruitment sites go the same way.

  1. Site gets announced
  2. Some people join
  3. It catches fire and everyone puts their data in
  4. The recruiters swarm
  5. People get annoyed, bored, or don’t land that dream job, and most accounts/profiles age
  6. Network/site starts blasting out emails and alerts about people looking for you and jobs waiting for you
  7. That powers the site for years
  8. Revenue grows slightly but never breaks out
  9. Next site crops up and chips away at existing sites’ margins

LinkedIn’s traffic has fallen off.  It has basically found the 16 million people looking for the jobs it can offer them.

People not in the market don’t update their profiles.  The folks most likely to power big network-connection growth don’t need a networking site. Folks who want a bitching resume just go off and build one at their own domain or Facebook page.

Put all of this together and LinkedIn is another Monster.com.  It’s big, has users, might add a bell or whistle, but has no place to go.   Its fate is set and completely determined by what it actually is – a job board.  Its only hope for growth is to somehow magically get people more jobs than any of the other methods. It won’t die, as people always need networking and jobs.

I predict some old skool news media company will buy it one day and squeeze it for revenue.  It’s a safe haven for the weary Internet exec.

Read Full Post »

Actually this is a provocative title to get parents and teachers to read online crap. Kinda ironic, don’t you think… it is supposed to sound like concerns from worried parents.

One brain scientist at UCLA, Gary Small, a psychiatrist, argues that daily exposure to digital technologies can alter how the brain works. “Brain scientist” does not equate to brainy scientist!

While violence and porn have received a lot of public attention, the current jive goes well beyond concern and elicits fear. Media-hawking ‘scientists’ purport that the wired world may be changing the way we read, learn and interact with each other. Duh…

Dr. Small claims that brain circuits involved in face-to-face contact can become weaker due to time and exposure to digital media. Of course he offers no data, and the directionality of the changes is impossible to determine, if they empirically exist at all. …did the person select a digital world because of his or her brain, or did the digital world change the brain, leaving it less emotive, less rewarded by being around people?

Small says the effect is strongest in so-called digital natives, for now. It is the teenagers and 20- and 30-year-olds who have been “digitally hard-wired since toddlerhood.” [Is pop-science the same as junk science?]

More than 2,000 years ago, Socrates warned about a different information revolution. He knew learning was important. Yet he lectured that the rise of the written word was a more artificial way of learning than the oral tradition. More recently, television sparked concerns, then movies, then the video games that would make our precious youth more violent or passive and interfere with their education. It was even rumored that TV watching interfered with their sight, fantasy development and ability to do well in school. YIKES!

There isn’t an open-and-shut case that digital technology is changing brain circuitry in any way different from how an athlete’s brain or a student’s brain changes due to plasticity… the things a person does change the neural pathways of the brain so that the person doesn’t have to relearn everything they did yesterday all over again when they do it today.

Not enough scientists and non-scientists are skeptical of digital fear-mongering. It appears to be a way for doctors to get copy in online and print media. I got some articles off the web on this… There is little to prove or disprove the digital fear speculation.

Dr. Robert Kurzban, a University of Pennsylvania scientist, states the obvious: neurobiology is complex and incomplete, and there is still a lot to learn about how a person’s experiences affect the way the brain is wired to deal with any interaction, social or digital. They are separate issues: neurological wiring AND social interaction.

It appears to many in education and science that social interaction is a reinforcer just like food and water. Deprivation and overload appear to work in a similar fashion, as anyone who has ever been in jail or from a large family will attest. Montessori educators have practiced a version of education and development that maintains that each student gets just what they need when they are ready to process it, and there is no absolute course on when, where and if that is going to happen or should happen.

But anything we do changes the brain due to plasticity. Even Googling. Some scientists suggest the brain actually benefits from Internet use, which is equally as silly as claiming that the brain is harmed by all things digital.

The developing brain builds pathways as learning occurs that gradually allow for more sophisticated processing. This is true of car mechanics and interpretive dance. It is also true for learning scripture, whether it is based on Buddha, Mohamed, Christ or Jim Jones. It is all the same to the brain. Early on, “stuff” that isn’t used gets sloughed off in a pruning of dendrites and neural connections that keeps the brain working efficiently. Over time the 100 billion neurons, with their 100,000 connections each, come to grips with the environment, internal and external.

Children do more reading earlier online than in the Dick and Jane books at school. There is more and greater variability online than even seasoned educators can grasp. All in all, some parents can’t absorb or rationalize it. Yes, games are played to a frenzy. Yes, there is stuff out there that makes a sailor blush. No one knows how it will all turn out. There is also a bit of “Dr. Seuss was good enough for me! Why do you have to be online all the time reading about arbitrage and the credit crunch or the net worth of Hollywood’s stars under 21 on Yahoo?”

For my 20 cents, we shouldn’t have such a narrow view of children, humans or animals as to expect some single aspect to leave a great hole or scar in their behavior or in man’s treatment of others. That flag is already waved by organized religion. They have a lock on it, except for what is being played out digitally in games. We’ll see what happens tomorrow.

Read Full Post »

Data once was a signature, a number on a driver’s license or even a newspaper subscription. Now it is much more, and less of it is what you are used to accounting for. Digital information today is recorded by all manner of sensors you are not aware of and whose consequences you don’t see. The new reality is ‘Reality Mining’ of data.

From phones, GPS units, RFID tags in office ID badges, texting, scans of your car through toll booths, credit card activity at ATMs, stores and gas stations, and cell tower logs of your calls, sensors are capturing your behavior in digital form. Coupled with the arguably suspect ‘security’ of anything digital – including health information, income statements and the time, place and duration of your Web surfing – this data organization and mining has birthed the emerging field of “collective intelligence”. Welcome.
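
To make the mechanics concrete, here is a toy sketch (entirely illustrative – the record shape, the identifiers and the sample feeds are my assumptions, not any real system’s) of how trivially records from unrelated sensors merge into one behavioral timeline once they share any common identifier:

```typescript
// Toy illustration only: hypothetical record shape and data, not a real system.
interface SensorRecord {
  subjectId: string; // phone number, card number, RFID badge, license plate...
  source: string;    // "tollbooth", "ATM", "cell-tower", ...
  timestamp: number; // epoch milliseconds
  detail: string;
}

// Merge any number of unrelated feeds into one person's ordered timeline.
function buildTimeline(feeds: SensorRecord[][], subjectId: string): SensorRecord[] {
  return feeds
    .flat()
    .filter((r) => r.subjectId === subjectId)
    .sort((a, b) => a.timestamp - b.timestamp);
}

// Hypothetical sample feeds:
const tollScans: SensorRecord[] = [
  { subjectId: "555-0100", source: "tollbooth", timestamp: 1225200000000, detail: "East TX plaza" },
];
const atmSwipes: SensorRecord[] = [
  { subjectId: "555-0100", source: "ATM", timestamp: 1225203600000, detail: "cash withdrawal" },
];

// A slice of someone's day, reconstructed without ever observing them directly.
console.log(buildTimeline([tollScans, atmSwipes], "555-0100"));
```

The point of the sketch: no single feed is alarming; the join is.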

All that digital information is going to servers and being reformatted into databases that are as viral as anything you, or a person with a lot of letters after their name, can even imagine. And we are adolescents in this development.

A thick web refers to what our ubiquitous use of the web has brought us worldwide: data, terabytes upon terabytes of it daily. Collective intelligence practitioners acknowledge that their tools will create a sci-fi future on a level Big Brother on cocaine could not have dreamed of. In fact, that is the ‘thing’ about what is going on: we have no idea how we, our families, our companies, cities and nations are going to be impacted as this approach to information finds every nook and cranny of everyone’s life. Stopping it is not an option.

Collective intelligence will make it possible, not probable, for insurance companies, employers and pharmaceutical companies to use data to covertly identify people with a particular gene, profile, affliction, etc., and deny them insurance coverage, employment or bank loans. They can also use it to snuff out an epidemic just as covertly. I wonder where the value of this will be positioned? The government, through its budgeting selections, can assist law enforcement agencies in identifying opposition members’ behavior by scanning and tracing public and private social networks (our old friend the Patriot Act has morphed while we worried about our 401Ks and “the wars”).

“Pernicious” means exceedingly harmful. Pernicious implies irreparable harm done through evil or insidious corrupting or undermining <the claim is that pornography has a pernicious effect on society>

Now, today, we have and are using the capability to assess a person’s behavior with reality mining of data and then interpret that profile without monitoring him or her directly, talking to him or her, or “knowing” them. Does Kroger care if you are in a bad mood when you scan your value card? Is the East TX toll reader interested in your reasons for being there mid-afternoon? Does Macy’s want you to buy only brand X and not brand Y at the same price? They all care about your behavior, not your feelings and emotions and intentions and, and, and…

It is a mashup! People and organizations interacting with one another through multimedia digital means will never be less than it is today; it will always be more. Those interactions dynamically leave traces of that ‘behavior’. This allows scientists, the Mafia or overzealous investigative reporters, for example – anyone with the technology and access to the databases – to study and learn about the behavior of those traced, without the knowledge or consent of the people and groups being scanned. Techniques like these are thought to infringe on the individuals and groups being traced, for the commercial benefit of those that have that technology over the individuals, groups and commercial entities that don’t.

What’s more, if you or your group doesn’t want to be scanned, traced or digitally followed, you have little say in the matter. Take the instance of “opting out” that’s put forth as a countermeasure today against being a target for spyware, spam and behavioral marketing… “Opting out” is another way some companies validate a cautious web user. For some it means that a different level of secrecy is needed for those who understand counter-control methods. If you want to be removed from lists you have the following troubles (WHICH ARE ALSO TRUE OF REALITY MINING AND COLLECTIVE INTELLIGENCE EFFORTS):

  1. You don’t know where your data is; multiple servers, entities, in the “cloud”
  2. Once in the web, your data is not a ‘thing’ it is a “byt-pat”, a partial byte sequence and a partial pattern
  3. You don’t know who owns it legally
  4. You can’t catch up to it and kill it
  5. Your intimidation by technology is being used against you
  6. You have no counter-control outside the borders of your country
  7. Your hope is that it isn’t true of you and your data
  8. Arguments favoring scanning and autonomous tracing are wrapped in virtuous rationales

a. fighting disease: SARS, flu, etc

b. fighting terrorists: real and imagined

c. helping the sick and elderly

d. child safety: fear reduction

The reality of reality mining is that your data, collected by all of these methods, is like a thought you had last night while watching TV: you can’t get at it now, you can’t know exactly where it is located in your head, and once you get it back after going through some mental gymnastics, it is not the same as it was when you first had it.

Every day privacy becomes more of a myth than it was even last weekend, during the USC game, when the water company could tell – they have the data – when halftime occurred, due to a drop in water levels over an eight-minute period. We expect that the water will be there, and it was. We expect that no one is watching, but what “watching” means is changing. It is changing really fast and in ways no one at MIT, the Office of Management and Budget or the Justice Department can predict or control. The stakes for success are high. Kroger is working on it.

Are you ready for a wild ride on a roller coaster in the dark without handrails? That is what’s coming here.

Read Full Post »

There’s a new (but old) meme making its way around the web via Facebook Notes/status, forums, blog posts and comments.

See how far it’s burrowed into the web. Or try technorati if you like.

The game is nothing more than printing a quote from a book according to some rules and passing it along.

Here’s a quote from me:

“Suppose we do another version of the calendar analysis we did in the previous chapter with hockey players, only this time looking at birth years, not birth months”

Rules:
* Grab the book nearest you. Right now.
* Turn to page 56.
* Find the fifth sentence.
* Post that sentence along with these instructions in a note to your wall.
* Don’t dig for your favorite book, the coolest, the most intellectual. Use the CLOSEST.
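
The rules are mechanical enough to run as code. A minimal sketch, assuming a plain text of the book and roughly 1,800 characters per “page” (both assumptions mine – real pagination and sentence-splitting are messier):

```typescript
// Sketch of the meme's rules. "Page 56" is approximated as a slice of plain
// text (an assumption), and sentence splitting is deliberately naive.
function memeQuote(bookText: string): string {
  const charsPerPage = 1800; // rough assumption, not real pagination
  const page56 = bookText.slice(55 * charsPerPage, 56 * charsPerPage);
  const sentences = page56.match(/[^.!?]+[.!?]/g) ?? []; // naive splitter
  return (sentences[4] ?? "").trim(); // the fifth sentence
}

// Usage: feed in the text of the CLOSEST book, post the result to your wall.
const nearestBook = "..."; // stand-in for the closest book's text
console.log(memeQuote(nearestBook));
```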

Where did this start?

Why are we all passing it on?

What interesting data is there in all this?

What’s the most widely kept-at-our-sides book right now?

What about this meme works where others haven’t?

Read Full Post »


We are about to go into a “phase,” as my mother used to refer to my dependence on a specific set of terms, ideas or behavior. That phase is going to be around for a while, and it’s going to drive us nuttier than a 4-year presidential election campaign. It is a ‘racism’ phase, and it will end up giving the neocons apoplexy, the centrists gout, and making the left wingnuts tongue-tied.

No, racism doesn’t have to be black vs. white. If you think that, you haven’t been paying attention. It can be British vs. French, Indian vs. Pakistani and a hundred other real or imagined lines of descent that we think make a difference. It is all the same racism.

And it is now open verbiage to be applied to uncritical street cred and maniacal approaches to everything imaginable. You are racist if you want secure borders. You are racist if you don’t care one way or another. You are racist if you want a strong UN. You are racist if you want your children to go to public – private – parochial or military school (pick).

As for those who think that because we have a black President it must be dead, pleassssseeee! Spend some time in the south, in the north, east, west, in prison, on a cruise ship, in a space program…

No, racism is not dead. It is not even sick. It is alive and well and willing to come out with a joke, a glance, a decision to stay home from the blues concert. It is euphemism expressed openly. It is seen in almost any non-SNL skit where “nudge – nudge, wink – wink” says it all. It is at every eatery you drive thru or opulently satiate yourself at, as well as at every level of management seeking a vision to hide our primitivism. It is convoluted, reverse, covert, overt and illegal. It is in front of you and it is behind you. Worst of all, it will never die any more than the fear that generates it will die.

In the end, it is an abject announcement that ‘my cave is better than your cave’ and that I am more in charge of my future if and only if I can make some entity less so that I will be more. It is yours for use as long as you are willing to accept ‘Diversity training’ for your transgressions. It is yours for use as long as you can point to your involvement in affirmative action pamphleteering or some absurd “Guilt-be-gone!” behavior equivalent to two wrongs make a right.

As complex interacting humans that are endlessly pressured to define who we are, we need to call people on their use of racism, and be called on ours, in communications of all sorts. Just ask, “What do you mean?” Otherwise, its easy use will morph into an all-encompassing modifier giving credence to our position or shock value to our voice.

So, now we need to turn our phasers on and be ready to shoot down, stun or punctuate any conversation or ink that rides the coattails of an Obama victory to profess its new slant on bigotry.

Read Full Post »

I present to you a principle of mine relevant to releasing websites and web software.

You Don’t Really Know Until It’s Live Principle

Basic Idea:

No one Looks At Anything Until it’s Publicly Released.  Then it’s a frenzy of real feedback.

The implications:

  • There is no better QA than real users on real software on real hardware in the real world
  • You can’t experience software unless you get the full experience in the real world, real setup (and I mean EXPERIENCE, not test or click or review)
  • the Halting Problem applies big time – that is, you won’t know what breaks a site/service/software until something breaks it.  Even the best unit tests and XP efforts won’t uncover all the halts

It’s been true for the last 78 pieces of software I’ve worked on…

Why does this principle hold?

a) the consequences (the stakes!) are very high when a piece of software is LIVE.  Thus it is very reinforcing for people to give feedback and really dig in. (oh shit, it’s live!)

b) Technology obstacles and lots of caveats usually hold in prototypes, mocks and dev sites (oh, ignore that link, we didn’t get to that yet)

c) Websites are very complicated, especially ones where you have lots of mixed media, a complete design overhaul, a new backend, aggregation, and so forth.  Mocks can never showcase the full experience, and experiential bugs are impossible to uncover unless you are in the flow.

So spare yourself the agony of deciding when to release or trying to be perfect on public release.  Just release.  You’ll get on with the fixing and improvement cycle sooner.
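
A minimal sketch of what instrumenting that live feedback loop might look like on the client side (the /client-errors endpoint and payload shape are hypothetical – the point is only that the halts real users hit get reported the moment they happen):

```typescript
// Minimal sketch: surface the halts that only real users on real hardware hit.
// Endpoint and payload shape are hypothetical.
function report(payload: object): void {
  // Fire-and-forget so error reporting never blocks or breaks the page itself.
  navigator.sendBeacon("/client-errors", JSON.stringify(payload));
}

window.addEventListener("error", (event: ErrorEvent) => {
  report({
    message: event.message,
    source: event.filename,
    line: event.lineno,
    url: window.location.href,
    userAgent: navigator.userAgent,
    when: new Date().toISOString(),
  });
});

// Unhandled promise rejections are exactly the halts unit tests rarely cover.
window.addEventListener("unhandledrejection", (event: PromiseRejectionEvent) => {
  report({ message: String(event.reason), url: window.location.href });
});
```

Nothing here replaces QA; it just makes the post-release fixing cycle start immediately.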

Read Full Post »

http://www.nytimes.com/2008/10/28/opinion/28brooks.html?_r=1&scp=2&sq=&st=nyt&oref=slogin (free registration to read if not registered…)

Going with what is postulated in this NYT.com article, there are four steps to every decision…

  1. you perceive a situation
  2. you think of possible courses of action
  3. you calculate which course is in your best interest
  4. you take the action

&^+%$!!)*?<#!

If only it were that simple.

Over the past few centuries, public policy pundits, talking heads and some academicians have presumed that step three was the most important. Social science disciplines are premised on that presumption as well; despite the ink used to propagate altruism at every opportunity, people calculate and behave in their own self-interest.

Greenspan, quoted in the above article, made that clear for his reign and for the country. His comments aside, none of the steps above are worth a lot without the others.

Most of the processing takes place without literal awareness. We behave and when pressed for why, we generate a story that fits that situation and puts us in a virtuous light. We don’t really perceive all that well. Thus, the step that seems most simple is the most complex. Looking at and perceiving the world is an active process of symbol meaning-making that shapes and biases the rest of the decision-making chain.

Psychologists have been exploring our biases for four decades with the work of Amos Tversky and Daniel Kahneman, and also with work by people like Richard Thaler, Robert Shiller, John Bargh and Dan Ariely. Now Brooks would have it that it is time for the economists to contribute. Gasp!

The desperation of the day may mean a new wave of behavioral economists and others who are next to bring pop psychology to the realm of public policy. These are the same pundits that used their antiquated assumptions to provide plausible explanations for why so many others are wrong about risk behaviors and globalization implications.

Take Nassim Nicholas Taleb, for instance. In his books “Fooled by Randomness” and “The Black Swan” he explains it all in an equally simplistic manner as the four rules above. As an astute colleague pointed out bluntly, we are asking the guy who coined the notion of “black swans” to predict black swans. The irony is laughable. What gives a black swan example its value is that it is not obvious [read: predictable]. While Taleb may have seen it coming, as stated in the above article, that precludes it from being an example of a “black swan” phenomenon. Irony for sure.

When Taleb gets on the philosophical diving board to spring into evolutionary causation, decreeing that humanoid brains evolved to fit a less complex world, I found myself gagging instead of gasping. His examples of the perceptual biases that distort our thinking are themselves century-old prejudices.

1. Our tendency to see data that confirm our prejudices more vividly than data that contradict them

a. We recognize information due to its relation to existing cues we have in our repertoire. We don’t see what we haven’t been reinforced to see; it is not self-deception any more than it is self-enlightenment when we see what turns out to be correct. In that set of circumstances the correctness is not based on enlightenment but on relationships that were there all along but not focused on, recognized or reinforced by the environment.

b. That environment is the same one where superstition, myth, magic, mind and phenomenalism are considered valuable parts of our “humanity” and, knock on wood, we sometimes guess right despite the reasons behind the guess.

2. Our tendency to overvalue recent events when anticipating future possibilities

a. the last 6 months is more like the next six months than the last 1000 years are like the next 6 months

3. Our tendency to spin concurring facts into a single causal narrative

a. if for no other reason, this site is the mainstay of the defeat of monocausality which haunts our culture, bolsters our superstitions and keeps us surprised at regular intervals

4. Our tendency to applaud our own supposed skill in circumstances when we’ve actually benefited from dumb luck.

a. We benefit from historical uniqueness and education that is more than smattered with scientific skepticism as opposed to boorish cynicism that ignores our strengths and panders to the voodoo in the caves.

b. See 1.-b above.

Errors of perception are everywhere when experimental analysis is NOT involved. Clearly, getting to our moon and beyond was due to experimental analysis and NOT interpretation of perceptions of pundits.

Without experimental analysis we’ll continually fail to perceive “what’s going on out there.” The relationships between a zillion things and another zillion things are too complex. While a four-point decision tree helps us walk across the street in a small town, it is not the way to figure out how to navigate the rules of this year’s tax code or interpret the Patriot Act I or II on any given Sunday. Who knew, and who still knows, which small events are linked to big disasters? Who knew that the mechanical Newtonian links were there, as well as the selected consequences of a billion factors coming together that would [pick one] (cause – contribute to – accompany) a social-political-economic unraveling? Experimental analysis was not involved. Interpretation of biases was.

Faulty perceptions are not the only reason or application for an experimental analysis. Relationships are complex, not caused by single small or enormous events as you have been trained to think. We don’t have much training to recognize or understand what our own self-interests are in anything but localized strings of spatial-temporal events. Brooks’ toying with trusting government to become engaged in the process is folly. Just how much “help” can a country endure? What’s worse, it is lazy. Separating government and business is impossible, but collusion is asking for our own demise, handed to us as a coupon toward irrelevance. While we regularly make poor decisions, the government is insensitive to making the correct ones, or those needing to be made, in a timely fashion.

If you doubt that, don’t look in the rear view mirror as some would suggest. Follow the consequences of a potential decision and determine for yourself whether you or an agent of an ideology is better suited to care for what is in your best interests. Government information feedback mechanisms are limited, broadly myopic and mechanical; not timely. The very thing that got politicians away from the citizenry has let ideology numb them, contributing to an end to pragmatism. This bias, to be sure, is no better or worse than any other bias. They all can be replaced with an experimental analysis from science rather than the pop pap solutions we are offered.

As we’ve seen from the crashes before the latest one, this set of economic biases just keeps on giving. It keeps on giving us the problems that government is content to continue to administer to: mindfulness, equality of everything not equal, brinkmanship over leadership and, above all, saying what works to get re-elected. As stated, this meltdown is a cultural event reminding us that we are perceptive beings, seeing things that aren’t there and not perceiving things that are there. (See previous blogs.)


Read Full Post »
