Friday, May 15, 2009
I made a mistake a long time ago that I am still paying for.
I have come to this conclusion over the last couple of months as we look at our support of RealMedia. To put it simply -- RealMedia is a dead end at this point. I can't imagine why anyone would want to put content they are creating into this format, and worse, force their audience to install yet one more player that is notorious for taking over playback of everything on your computer.
It was a necessary decision at the time because we lived in the days of 56k modems, and we wanted to deliver audio and low-bitrate video. It was a remarkable achievement that we could deliver video at all back then. It wasn't very good -- but it worked!
On my campus, we face a similar dilemma now. My organization doesn't offer a follow-up solution to RealMedia. We don't offer a way for people to ingest media, control access, or automate workflows.
Unfortunately, our campus does support something that does -- but it is couched in a framework that is 100% Windows technology. It uses Silverlight, but can play back Windows Media as well. The worst part is that the content lives in this framework, never to get out. In this regard it is even worse than RealMedia, because there is no exit strategy. At least RealMedia content can be played without a server.
But we live in a vacuum on our campus, so it is seen as a viable solution. I don't blame people at all for adopting it -- there is little else other than iTunes U that they can use. I have to watch helplessly as the migration begins to a single-vendor solution, with little or no hope of mobile playback, housed in a container that is every bit as proprietary as RealMedia.
Most people simply don't care. I have written at great length in the past about my concern that content we create today won't be playable in the next 10 years. This has already happened with very old RealMedia content -- it will not work in the latest RealMedia player. I am sure that Microsoft won't make a similar mistake... oh wait -- there was this technology called Indeo, a codec from the Video for Windows era. I have some of that content on my laptop right now, and I can't figure out a way to play it back -- or even convert it. It is dead and inert.
We are forced into thinking short term -- how can we solve the problem NOW -- with little concern for the future. Ironically, in this age of open standards, the move for some of the more compelling technologies is to pull content into a box and not let it escape. There is, by design, no exit strategy for this content.
I will spare people all the ramifications -- what happens when the vendor goes bankrupt -- what happens when something better comes along -- and you are stuck (again, RealMedia). The reality is that most people don't care.
It is like we are publishing books that only can be read with a certain device, from a single vendor. Oh wait, Amazon is doing that now.
Tuesday, March 31, 2009
More Kindle
http://www.talkingpointsmemo.com/archives/2009/03/until_quite_recently_id_seen.php
I think the author tries to carefully step around the whole "Kindle vs. book" dichotomy, and I applaud him for that.
I feel the same way about the Kindle as I do about digital reproductions of paintings. They are an adequate facsimile, but the viewer should never confuse the two, and unfortunately many won't ever know what they are missing.
I have spent some time looking at Robert Rauschenberg's paintings. He is known for his layers, combining offset- and transfer-printed media with found objects affixed to the painting (raising the surface). He overpaints on top of things, creating a kind of damaged opaqueness that I find compelling.
What I have discovered is that, in looking at digital reproductions, I am missing a lot of that detail. This should come as no surprise. The issue here is that I don't know that I am missing that information -- that in photographing the work, a lot of it is lost.
In transferring books to a Kindle, we transfer the content and its textual representation. In many cases, it comes down to reader preference (Kindle vs. iPhone vs. computer vs. real book). All do an adequate job of displaying text.
But I think about books that rely on the characteristics of paper. Edward Tufte's books could be put on a Kindle, but I think it would be a damaged experience. A lot would be lost -- perhaps not the drawings, tables and graphs, but the physical layout that Tufte spent so much time perfecting would potentially vanish. Someone looking at it on a Kindle may never know the difference between the two -- and as a result key information (his layout reflects his central ideas of information organization) would be lost.
I have to be careful here and not sound like a Luddite. I think the Kindle is kind of cool. It is just that many of the books I want to read wouldn't really work on the Kindle -- a few would, to be sure, but my experimental typography book surely would not.
Of course this will all change soon enough. Display technology will continue to improve. The challenge for designers will be to appropriate from books what makes sense, but embrace what the medium/platform has to offer, forever changing the reading experience.
Will books go away? I think mainstream books will eventually. There will be enclaves that continue to publish books out of love of printed matter, or because their designs will not transfer successfully to the new medium. It may be akin to the market for high-quality audio reproduction versus 256 kbps MP3s. The MP3s are close enough that most won't know the difference, but they won't ever appreciate the nuance that may be lost -- or even care -- given the advantages of lossy compressed audio.
Sorry to keep this so long, but this is a central topic for me: what happens to media when it migrates. What is gained, what is lost, how does the physical representation of the media cause its meaning to change (yeah, yeah, McLuhan). It is important to consider. But it is also inevitable. Fighting it is ridiculous, for it is going to happen. Better to look at what it can do for us than to spend time criticizing its failures.
So the Kindle will likely spawn more things like it. I am concerned about the Kindle for a different reason: Amazon's business model pushes back against consumers. Buyers end up with less than they had before, and the content is tethered to a device for viewing. Not even the iPod does that. It is a precedent that I find worrisome.
--hal
Wednesday, March 25, 2009
The Creative Cycle at Nintendo
Here's an encapsulated, abbreviated excerpt from a presentation today by Nintendo's CEO (Iwata) and Miyamoto, Nintendo's head game designer. Miyamoto gets gaming in a way that few others do. He is an inspiration to me for his ability to take life experiences and translate them into art. This excerpt hits many familiar points of the creative process.
Excerpted from:
http://www.joystiq.com/2009/03/25/joystiq-live-from-nintendos-gdc-2009-keynote/#continued
Iwata recounts his history with HAL Laboratory working with Nintendo - when presenting a game to Nintendo, he was told "This is not bad. With a few months, this could become a quality game." Iwata says he was "speechless."
"We did not have a few more months. We had very little time. In fact, we had two more days." Without the revenue, they would report a loss for the year, lose their funding, and enter a "death spiral." Death spiral = Financial pressure + less time + poorer quality, culminating in lower sales.
Iwata restructured the company to make better games -- he believed Nintendo was able to make better games because they had more money. Now, he understands this better. The way HAL and Nintendo did things was not at all the same.
Miyamoto's gardening hobby turned into Pikmin; he got a new dog, and that turned into Nintendogs; exercise turned into WiiFit.
Iwata jokes that he asked Miyamoto to never talk about his hobbies outside of work – he's on a 24/7 non-disclosure agreement.
MIYAMOTO'S WAY - "Upward Spiral"
1. Ideas are everywhere
2. Personal Communication
3. Prototype Stage
He shows off a very rough "prototype" of Wii Boxing, encouraging developers to spend more time on the game's mechanics. "The amount of time being spent on the game's graphics was zero. Perhaps you can tell that."
4. Small Teams
5. Multiple Projects
6. Trial and Error
"Sometimes no matter how hard they work, the small teams struggle to meet their objective. That prototype phase can last two years." If they set a project aside, that's the nature of trial and error. "For Mr. Miyamoto, prototype making allows for the most trial and error where the smallest number of developers" can work on the game. "This is one of the most important characteristics of Mr. Miyamoto's approach that I have observed."
"Of course, with so many project going on at one time" some make it beyond that prototype stage. So, the last stage: The Mass Production Stage. Mr. Miyamato, who began as Iwata's mentor, now reports to him. Not always a pleasant scenario.
Once one of Iwata's latest hobbies reaches the prototype stage, he makes it a point to not ask about how it's going. It makes it difficult for Iwata to predict when a product can begin to generate revenue – "which is not very good for my mental health."
If they throw out an idea, that work is not wasted. "I have seen some of these ideas show up years later," Iwata says.
--hal
Monday, August 18, 2008
5 Questions before starting any new Project
Wrote these while sitting in a meeting. I have asked these same 5 questions many times before when talking with clients.
I am sure there are more, but these 5 always work for me.
- Who is the audience?
- What is the purpose?
- What is the expected outcome? (Visualize and describe what the end looks like.)
- Who does the work?
- What is the cost (people and money)?
Wednesday, June 25, 2008
A Posting and its aftermath
The other day, I saw an instance where the responses to an article were much more interesting than the article itself, and in a way, the two were hopelessly intertwined.
Responses to "IT is boring say graduates" is custom made to generate traffic on a site like slashdot, or many in IT a daily read. And it certainly did.
First -- the study itself queried 2000 non-IT curriculum students in the UK for their views on job propects, funding and job satisfaction (or perhaps, "fun"). Only 60% said that they wouldn't go into IT because it would be boring.
What was astonishing for me was the vast similarity in responses -- essentially -- that yes, IT (and work in general for many) is boring -- and those young no-nothings better get down with that fact!
First off -- there are many questions to be asked about the data collection itself. That may not sound interesting -- but the story of the data can be quite revealing. Which students? Which colleges and curriculum (2000 students in theatre arts?).
The article was superficial, great headline and little substance. Custom-made for the soundbite delivery of Slashdot, and it's readers. There were some good responses, and of course the usual digressions on to things like capitalism and quality of life.
I think it was a dumb question to ask in the first place. The idea of IT as a department may well evaporate in the next few years for some organizations. There will be layers upon layers instead. Certainly we see that now. Customization becomes the rule, not the option.
The idea of a boring job -- wow. I think that there needs to be engagement by the individual, or time is wasted (and we have a finite amount of it). I recognize the artistry of coding. I see the architecture and it makes me think. To be able to see profundity in everyday things is a valuable thing. I think some of the folks I work with are artists -- not in the hokey "cybercowboy art" sense, but in the very real, classic sense: making objects whose deep meaning is readily apparent. Good code is like this.
I have more questions than answers for this little bit of fluff of an article. I couldn't spend as much time on the comments as I would have liked, but the negativism I saw may have more to do with expectations and disappointment than with what could be possible.
Responses to "IT is boring say graduates" is custom made to generate traffic on a site like slashdot, or many in IT a daily read. And it certainly did.
First -- the study itself queried 2000 non-IT curriculum students in the UK for their views on job propects, funding and job satisfaction (or perhaps, "fun"). Only 60% said that they wouldn't go into IT because it would be boring.
What was astonishing for me was the vast similarity in responses -- essentially -- that yes, IT (and work in general for many) is boring -- and those young no-nothings better get down with that fact!
First off -- there are many questions to be asked about the data collection itself. That may not sound interesting -- but the story of the data can be quite revealing. Which students? Which colleges and curriculum (2000 students in theatre arts?).
The article was superficial, great headline and little substance. Custom-made for the soundbite delivery of Slashdot, and it's readers. There were some good responses, and of course the usual digressions on to things like capitalism and quality of life.
I think it is a dumb thing to ask this question. The idea of IT as a department will possibly evaporate in the next few years for some folks. There will be layers on layers instead. Certainly we see that now. Customization becomes the rule, not the option.
The idea of a boring job -- wow. I think that there needs to be engagement by the individual, or time is wasted (and we have a finite amount). I recognize the artistry of coding. I see the architecture and it makes me think. To be able to see profundity in everyday things is a valuable thing. I think some of the folks I work with are artists - not in the hokey "cybercowboy art" -- but in the very real classic sense, making objects that have deep meaning and are readily apparent. Good code is like this.
I have more questions than answers for this little bit of fluff of an article. I couldn't spend as much time on the comments as I would have liked, but the negativism I saw may have more to do with expectations and disappointment than what could be possible.
Gates Rant Shows He Gets It
We want to believe that Bill Gates is some sort of clueless nerd past his prime (e.g., "The Road Ahead"), but occasionally he gets it.
Read this rant and see what happens when Bill Gates tries to install MovieMaker.
By the way, did you know there are two versions of MovieMaker for Vista?
Thursday, June 12, 2008
The upcoming HD revolution
Tracking the cost of HD cameras has been a sideline of mine as of late. I am finally in the market, and prices continue to plummet. As I mentioned in my other blog, there is the $200 Aiptek A-HD, which, with all its flaws, does indeed shoot HD video.
I think about this because I feel a sense of déjà vu. "It's not professional," I heard someone pronounce the other day about a consumer HD camera. Professional videographers avoided YouTube until it became apparent that they would be edged out by the great unwashed masses.
I am quite excited by this trend. Watch this music video, which was shot entirely with a Sanyo HD1000.
There are problems, to be sure -- chroma keying is more difficult with the type of encoding these cameras use. There may also be problems with software-based image stabilization, particularly at 30fps. I can say, despite popular wisdom, that I have managed to get acceptable chroma keys from DV footage. Sure -- I would love to shoot on more expensive formats with a better colorspace. But it is indeed possible. I can even mimic the color look of a particular film stock with a $200 piece of software and a little elbow grease.
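For anyone curious what a basic key actually involves, here is a minimal sketch of a green-screen composite in Python with OpenCV. It is only a rough illustration, not the actual workflow behind any of the footage mentioned above: the file names and HSV thresholds are placeholders to be tuned per shot. The detail it tries to show is that heavily compressed consumer footage stores chroma at reduced resolution (4:2:0), so the raw matte comes out blocky and benefits from a little softening.

import cv2
import numpy as np

# Hypothetical inputs: a foreground frame shot against green, and a background
# plate of the same dimensions.
frame = cv2.imread("frame.png")
background = cv2.imread("background.png")

# Key in HSV: select the green range. These bounds are rough guesses, tuned per shot.
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
green_mask = cv2.inRange(hsv, (40, 60, 60), (85, 255, 255))

# Compressed consumer footage carries chroma at quarter resolution, so the raw
# matte has blocky edges; a light blur hides some of that.
green_mask = cv2.GaussianBlur(green_mask, (7, 7), 0)

# Build an alpha matte (1.0 = keep foreground) and composite over the background.
alpha = (255 - green_mask).astype(np.float32)[..., None] / 255.0
composite = (frame * alpha + background * (1.0 - alpha)).astype(np.uint8)
cv2.imwrite("composite.png", composite)

Even a toy example like this makes the tradeoff visible: the keying math is trivial, and the quality of the result is dominated by how much chroma detail the camera's encoding preserved in the first place.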
This stuff puts "close to film" in the sub-$1k region. Let others get excited by the upcoming RED Scarlet. I think the revolution is happening right now.