There is only one Apple, and why this is so hard for some to acknowledge

Apple is on the rise. The company has enjoyed incredible growth over the past decade, climbing from being nearly dead to become the rising star of the tech industry that it is today.

For me as an Apple enthusiast, this is a good thing. The more people who jump on the Apple bandwagon (whether by using an iPod, an iPhone or a Mac), the better. I have explained this before in a previous post: more users means more income for Apple, which can then spend even greater amounts on R&D than it does today, resulting in even more and better products for us to enjoy.

I think the growth of Apple’s market share is the direct result of more and more people recognizing the benefits of using its products: the extremely easy-to-use interface, the gorgeous software design, but most of all the way everything is designed to work seamlessly together, from the hardware to the operating system to the core applications to the web services and now even to companion devices like the iPhone.

There is only one company in the entire world that offers its computers as completely in-house designed products: both the hardware and the software are designed by the same people. This is the complete opposite of how all other computer manufacturers work. They buy a generic operating system (which in 99% of cases is Microsoft Windows) and build a PC out of generic parts to run that OS. This leaves them very little room to differentiate themselves from each other, which generally comes down to competing purely on price.

Continue reading

iPhone evolution and how to avoid the Android problem

One of the reasons the iPhone is such a well functioning and exceptionally usable device lies in the fact that, completely in Apple fashion, both the hardware and the software are made by the same company. This way, the hardware engineers were completely aware of how the software would function, and the software engineers fully knew the ins and outs of the hardware platform, letting both achieve the maximum of what’s possible with the combination. This has worked very well in the past too: just have a look at the Mac to see how a complete package of tightly integrated hardware and software eliminates a lot of the problems that occur in the generic PC field, where all software is supposed to work on all possible vendors, types, versions and variants of hardware components in countless possible combinations.

Next to the obvious usability advantages for end users, having a clearly defined combined hardware/software platform is also a very nice thing for developers. Knowing exactly which device your software will eventually run on gives a developer some of the same benefits: he or she can take maximum advantage of the platform, without taking the risk that something will not work, or will work differently, on another type of device. You know the capabilities and limitations of the platform, and you do not have to guess what features might possibly be there, or worse: what features might be missing and how to deal with such a situation.
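
To make that difference concrete, here is a minimal sketch in Python. The device object and its method names are purely hypothetical, not a real platform API; the point is only to contrast the defensive probing a fragmented platform forces on you with the single guaranteed code path of a fixed, known device.

```python
# Illustrative sketch only: "device", "has_camera", "take_photo" and
# "take_photo_v2" are invented names, not a real platform API.
from typing import Optional


def capture_photo_fragmented(device) -> Optional[bytes]:
    # On a fragmented platform you must check at runtime whether the
    # feature exists at all, and which variant of the API is present.
    if not getattr(device, "has_camera", False):
        return None                      # the feature might simply be missing
    if hasattr(device, "take_photo_v2"):
        return device.take_photo_v2()    # newer API on some devices
    return device.take_photo()           # fall back to the older call


def capture_photo_fixed(device) -> bytes:
    # On a single, known hardware platform the capability is guaranteed,
    # so there is exactly one code path and nothing to guess about.
    return device.take_photo()
```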

Continue reading

Apple to embrace DisplayPort, now what?

With the introduction of the gorgeous new “unibody” all-aluminum 13″ and 15″ MacBooks, Apple made the move from DVI to DisplayPort. What gives?

Despite the generic-sounding name, DisplayPort is a new standardized connector and protocol designed to connect computers to digital displays. It was developed by VESA, a group of companies that has been defining various display-related technologies since the Super VGA era in the 80s.

In many respects, DisplayPort is a competing technology to DVI and HDMI. The DisplayPort group claims various technical advantages over DVI: the protocol is packet based (similar to the TCP/IP protocol that drives the Internet and most other networks), it is scalable so that it can be enhanced in the future without breaking compatibility, and it can daisy-chain multiple displays over one connector at the computer’s end. Most importantly, they claim lower cost, due to the lack of a step-in fee (like the $10,000 required for HDMI). And for technical reasons that go beyond the scope of this blog, and certainly beyond my technical expertise, it requires fewer components in a display, as the digital video format can be sent directly to the LCD panel, further reducing cost.
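
To give a rough idea of what a packet based display link buys you conceptually, here is a toy sketch in Python. It is emphatically not the real DisplayPort protocol; the packet fields and the routing function are invented for illustration only.

```python
# Toy model, not the actual DisplayPort protocol: every packet carries an
# address and a type, so several displays can share one connector in a
# chain, and new packet types can be added later without breaking old
# devices that simply ignore what they do not understand.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class DisplayPacket:
    target_display: int   # which display in the daisy chain this is for
    packet_type: str      # e.g. "video", "audio", "aux" -- extensible
    payload: bytes


def route(packet: DisplayPacket, my_id: int,
          downstream: Optional[List[DisplayPacket]]) -> None:
    """Each display consumes its own packets and forwards the rest."""
    if packet.target_display == my_id:
        pass  # decode and show the payload locally
    elif downstream is not None:
        downstream.append(packet)  # pass it along the chain unchanged


# A packet addressed to the second display passes straight through the first.
chain: List[DisplayPacket] = []
route(DisplayPacket(target_display=2, packet_type="video", payload=b"..."),
      my_id=1, downstream=chain)
```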

However, most of these improvements are hardly real advantages to general users, and I suspect political reasons to be the real motivation for its supporters to push this standard over the DVI and HDMI connectors. A different share of the IP fees and licensing is the more likely reason.

Continue reading

YouTube higher quality video and stereo sound

As you might be aware, videos uploaded to YouTube usually have a higher original quality than what the site shows you in the video window. Video and audio are transcoded to a low quality Flash format, resulting in fairly low quality video and low quality mono audio. I expect YouTube made this trade-off in order to lower the server load at their end (and as such reduce cost), but one might even argue that a lower bit rate results in a better experience for users on a slow connection (faster loading times, fewer hiccups), although this is becoming an increasingly smaller problem.

However, it seems that YouTube does in fact store the original uploaded video material, and not only the low quality Flash video that it presents to its users by default. This became clear when Apple introduced the iPhone in June of 2007, and with the Apple TV firmware update of that same month. Neither device can play Flash, yet both allow viewers to watch YouTube content. For this to work, YouTube offers the video in H.264 (AVC) format. And since older videos were also made available to users of these Apple products, one can assume that YouTube kept the original videos in order to do the transcoding.
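
As an aside, the two renditions can be approximated with a generic transcoding tool. The sketch below uses Python and ffmpeg purely as an illustration; the codecs, bitrates and file names are my own assumptions, not YouTube’s actual pipeline.

```python
# Illustrative only: roughly reproducing a low quality Flash rendition and
# a better looking H.264 rendition from the same uploaded master.
import subprocess

SOURCE = "original_upload.avi"  # hypothetical uploaded master file

# Low quality Flash rendition: small frame, low bitrate, mono audio.
subprocess.run([
    "ffmpeg", "-i", SOURCE,
    "-c:v", "flv", "-b:v", "300k", "-s", "320x240",
    "-c:a", "libmp3lame", "-b:a", "64k", "-ac", "1",
    "flash_rendition.flv",
], check=True)

# H.264 rendition for devices that cannot play Flash: noticeably better
# quality at a comparable bit rate, thanks to the more efficient codec.
subprocess.run([
    "ffmpeg", "-i", SOURCE,
    "-c:v", "libx264", "-b:v", "600k",
    "-c:a", "aac", "-b:a", "128k",
    "h264_rendition.mp4",
], check=True)
```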

As a nice side effect, videos played back on these devices looked better than the ones on the YouTube.com web site accessed from a computer, as H.264 is a fairly powerful, efficient and high quality codec.

Continue reading

The wide screen saga

Wide screen television sets were introduced in Europe around 1992. At that time, there were hardly any wide screen broadcasts, so in order to move these new wide TV sets, manufacturers built all kinds of artificial picture scaling technologies into their products, stretching out the image just so that the screen is “filled” and no black bars are visible at the sides. Of course, even though anyone serious about viewing moving images “the way they are intended” shudders at the idea of sacrificing picture aspect ratio, I can understand that in those early days, without any wide screen content available, such technologies were needed just to get the wide screen TV ball rolling.

Then came DVD in early 1997, the first mainstream video format to offer real, anamorphic, wide screen video. The DVD specification cleverly requires that a player be capable of compressing and letterboxing a wide screen DVD when a traditional 4:3 set is connected, while outputting the uncompressed, full-frame wide screen image to a wide screen TV. Unfortunately, the type of TV set that is connected to the player needs to be selected manually by the user, and this setting is usually buried deep down in the player’s setup menu. Since outputting an uncompressed anamorphic image to a traditional TV results in badly distorted pictures, while outputting a compressed letterboxed image to a wide screen TV does no harm to the aspect ratio, all manufacturers decided to set the player to “4:3 TV” by default.
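
A minimal sketch of that decision, in Python with invented function names, may make the logic clearer: the player only looks at the user’s TV type setting, and the factory default is chosen so that the worst case is wasted screen area rather than a distorted picture.

```python
# Illustrative sketch, not real player firmware: how a DVD player decides
# what to output for an anamorphic 16:9 title based on the TV type setting.

def output_mode(tv_setting: str, source_is_anamorphic: bool) -> str:
    """Return how the player should present the picture."""
    if not source_is_anamorphic:
        return "pass the 4:3 frame through unchanged"
    if tv_setting == "16:9 TV":
        # Wide screen set: send the full anamorphic frame; the TV itself
        # stretches it back out to the correct 16:9 aspect ratio.
        return "output full-frame anamorphic 16:9"
    # Default "4:3 TV": squeeze the picture vertically and add black bars
    # (letterbox) so the aspect ratio stays correct on a traditional set.
    return "output letterboxed 4:3"


# Why "4:3 TV" is the safe factory default:
#  - letterboxed output on a 16:9 TV merely wastes some screen area,
#  - full anamorphic output on a 4:3 TV distorts every shape on screen.
print(output_mode("4:3 TV", True))    # letterboxed, safe on any set
print(output_mode("16:9 TV", True))   # full wide screen picture
```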

Continue reading

A degree in picture management

Only recently, I took the plunge and bought myself my first Blu-ray Disc player, a Sony BDP-S350. I waited specifically for this model, for two reasons. First, I wanted the player to be BD-Live (or Profile 2.0) compliant, meaning that it is equipped with a network connector allowing certain BD titles to access the Internet and enhance the movie playback with online content. And second, I wanted my Blu-ray player to be a Sony, because as the main supporter of the Blu-ray Disc format, I expect Sony to provide the best support in terms of firmware updates, making my investment as future-proof as possible.

I hooked up the player to my Philips Full HD LCD television, which is about a year old. Much to my surprise, the picture quality of a Blu-ray title (in this case the magnificent documentary “Earth”) did not overwhelm me in the way I expected. Specifically, the picture contained, in my opinion, a lot of mosquito noise in darker areas, and the motion was also a little jittery. When trying a DVD, I noticed some of the same effects: noise, a lack of sharpness and imperfect motion. Of course this judgement might be due to me being oversensitive to video quality, but I was pretty sure that both the TV and this generally well reviewed player should be capable of delivering more, especially since the picture from my relatively cheap 1080p upscaling DVD player was free from these effects on the same TV. So I was determined to fine-tune the new player and the TV to get the results I expected.

Continue reading