AtHomeBoy_2000
Aug 6, 11:46 AM
Mac OS X Leopard
Introducing Vista 2.0
http://www.flickr.com/photo_zoom.gne?id=207241438&size=l
That's funny. A nice little jab at M$. Classic!
peharri
Jul 14, 03:11 PM
Some of this makes sense, some of it not.
I think AppleInsider is right about the case. With the exception of the MacBook, whose design has been rumoured for years and clearly was something Apple would have done even had this been the "iBook G5", Apple has made it a point with all of their Intelizations to use the same case as the predecessor, as if to say "It's business as usual, all we've changed is the processor." So from that point of view, the PowerMac G5 case being, more or less, the Mac Pro case, makes a lot of sense.
Two optical drives? No, sorry, not seeing the reasoning. The reasons given so far don't add up:
- copying DVDs - you can't legally copy 99% of DVDs anyway. If there was no need for twin CD drives, why would there suddenly be for DVDs?
- burning two at once - few people need this, and it's a great sales opportunity for a FireWire external burner anyway. Hell, why stop at TWO?
- Blu-ray - not unless they've really screwed up BR and drives with BR will be incompatible with existing media or something.
Against this, you have the confusion generated by a Mac with two optical drives. I have a Mac with two optical drives (a built-in combo drive and a FW DVD burner), and it's not terribly elegant. It's fine when reading discs (obviously), but writing them generates some confusion. How sure am I that I'm burning to the right drive? I'm not saying you can't do it, I'm just saying this would be unbelievably un-Mac-like. It'd be like the next version of iTunes coming with a menu at the top of its window.
It's also kind of easy to see where this rumour might have originated: some garbled communication where the rumourmonger says "two optical drive formats", or "two bays", or "multiple media readers" (hey, why not put an SD/CF/MS reader on the front? Pretty much everyone uses them these days, especially the prosumer market Apple is after. Bet there are more people who'd use an SD card reader than a FireWire port.)
I've been wrong before, but I'm going to go for a traditional PowerMac G5 enclosure, and a single optical drive which may, or may not, support Blu-ray in some shape or form.
ivan2002
Apr 6, 02:18 PM
No matter what Apple does lately, or how much they sell, or how good the forecasts are for sales, Apple stock continues its quick downward slide. What the HELL!! I just do not understand it ... especially while Google stock continues to climb at an incredible pace week, after week, after week.. :confused::confused::mad:
I often wonder how people make money in the stock market. Then I read something like this and remember: off of people who try to play that game without having any idea what it is about.
It's like thinking that the only skill necessary to win at poker is the ability to figure out the strength of your hand. It's not just that "average Joes" trying to play "investors" are unable to tell who the sucker is (it's them); it's that they don't even know that there is supposed to be a sucker!
KnightWRX
Apr 7, 10:46 AM
but to say that intel forced apple to use the IGP is not correct imo.
No indeed, it's not. Intel forced the whole OEM industry to use their IGP, not just Apple. ;)
No matter how you slice it, for some applications IGPs make sense. Intel cut the competition out of that market with their shenanigans, and now consumers pay for it with sub-par graphics processors.
LethalWolfe
Apr 10, 10:31 PM
Unless, like I posted earlier, the iPad app functions as a UI for the main application over the network. The Mac (or cluster of Macs) takes care of the heavy lifting, and the iPad is used to make edits remotely and broadcast to HDTVs.
AirPlay & AirEdit.
If you had a cluster of Mac Pros using Thunderbolt (or whatever... Ethernet, Fibre, etc.) to talk to each other, and you used the iPad as a remote UI, you could edit, compress, and broadcast from anywhere.
Apple has all the pieces in place to do this: AirPlay, Apple TV, iPad, iTunes as a media hub for all the devices to communicate, Qmaster, etc...
This has been a long time coming. I remember in 2006-2007 hearing rumors that Apple was working on a tablet-like controller for Logic. It was to be used to edit the timeline, or act as a virtual mixer, etc. This has been brewing for years, and I think it's almost a reality.
Avid demo'd basically this last year at NAB. IIRC all the media was on servers in Virginia and the presenter did the demonstration on a laptop using a web app.
Lethal
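As a rough illustration of the remote-UI idea in this post, here is a minimal sketch of a Mac-side service that accepts edit commands from a thin client such as a tablet app. The port number and the JSON command fields ("op", "timecode") are invented for illustration, not any real Apple protocol:

```python
# A minimal sketch only: the port (5050) and the JSON command fields
# ("op", "timecode") are invented for illustration, not any real
# Apple protocol. A thin client (e.g. a tablet app) would connect
# and send one JSON command per line; the heavy lifting stays here.
import json
import socketserver

class EditHandler(socketserver.StreamRequestHandler):
    def handle(self):
        command = json.loads(self.rfile.readline())
        # A real server would drive the editing/render engine here.
        print(f"applying {command['op']} at {command['timecode']}")
        self.wfile.write(b'{"status": "ok"}\n')

if __name__ == "__main__":
    with socketserver.TCPServer(("0.0.0.0", 5050), EditHandler) as server:
        server.serve_forever()
```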
laurim
Apr 25, 02:47 PM
If the Chicken Littles had any idea how transparent and documented their lives already are, they would never leave the house. It amazes me how many people think "other people" are trying to find out what they do in their mundane lives. Some egos!
I hate to think that a decent way to track potential terrorist movements was ruined by all of this BS. Imagine how much good information could have been had if a terrorist was arrested and his cell phone records were scanned to find out where other terrorists were meeting. But no, you people have to tell them to delete the file. Thanks!
MacSawdust
Aug 26, 10:40 AM
This now explains why mine is not valid.
bokdol
Aug 18, 09:22 AM
hey bokdol, you and i can start a business and help all the intel mac pro users dispose of their old G5 power macs
we can go into business :)
i'm in
we can start today
edenwaith
Jul 14, 04:34 PM
ONLY DDR2-667?!? :confused:
Come on Apple, you'd BETTER use DDR2-800 or I'll be pissed! :mad:
No, they better equip every new Mac with 10 terabytes of DDR9-5000 RAM! And they will also include a RAID 5 configuration at 20 exabytes! And the entire machine will be smaller than your fingernail.
But it will then come equipped with a 16 MHz Motorola 680x0 chip.
macbookmike
Apr 6, 06:00 PM
Please, please, P...L...E...A...S...E - can we have an integrated cellular data chip?
Reach9
Apr 11, 05:22 PM
Ah, so most of the stuff on Android is "better" only because it's on a bigger screen? :rolleyes:
So if Apple came out with a 6" iPhone, that would make it better than Android, right?
And the navigation app I purchased houses all the map data on the device and doesn't rely on a data connection to operate. Unlike Android's stock navigation.
Um, how about the entire OS?
There are also people (like me) who prefer not to carry something the size of an old-school Palm Pilot in their pocket.
Clearly you missed how multitasking and the notification system are better. And yes, size does matter. If Apple came out with a 4" phone it would be amazing, but it still wouldn't be better than Android unless they fix issues like the notification system.
Good for you. I like the fact that I don't have to buy an expensive app for something which comes free on another device. But here's the deal: for argument's sake I didn't count apps from the App Store or Android App Store. So the stock Maps application on the iPhone is completely primitive compared to Google navigation on an Android.
You're just proving my point.
Right, Android based their OS on iOS. But they have surpassed iOS in regard to usability as a smartphone.
When Steve Jobs announced iOS in '07, he said that the OS was 5 years ahead of its time. Well, he definitely proved it, but 4 years later there are amazing OSes around; it definitely isn't ahead of its time anymore.
I believe not all the Android phones are massive; you don't have to generalize. The following picture should make things clear:
http://4ucellphone.com/wp-content/uploads/2011/02/iphone-4-samsung-galaxy-s-htc-desire-screen-size-compare-580x365.jpg
iPhone 4. Samsung Galaxy S. HTC Desire.
I think the point you're missing is that I can also enjoy these features you're stating with my iPod Touch, and I'll still be able to enjoy the true smartphones, the Android phones.
Anyway, this is my own opinion; you can keep your fanboy perspective as well. Like I said, we don't have to agree.
Who knows? Maybe iOS 5 and iPhone 5 will surprise us all (in a good way). And then I won't be switching.
zoran
Oct 14, 02:50 PM
Well, it's said that Clovertown will be here early. Is that early/late November or early/late December? Any new rumors regarding this subject?
Iconoclysm
Apr 19, 08:24 PM
WRONG! They weren't invented at Apple's Cupertino HQ, they were invented back in Palo Alto (Xerox PARC).
Secondly, your source is a pro-Apple website. That's a problem right there.
I'll give you a proper source, the NYTimes (http://www.nytimes.com/1989/12/20/business/xerox-vs-apple-standard-dashboard-is-at-issue.html), which wrote an article on Xerox vs. Apple back in 1989, untarnished, in its raw form. Your 'source' was cherry-picking data.
Here is one excerpt.
Then Apple CEO John Sculley stated:
^^ that's a GLARING admission by the CEO of Apple, don't you think? Nevertheless, Xerox ended up losing that lawsuit, with some saying that by the time they filed it, it was too late. The lawsuit wasn't thrown out because they didn't have a strong case against Apple, but because of how the lawsuit was presented at the time.
I'm not saying that Apple stole IP from Xerox, but what I am saying is that it's quite disappointing to see Apple fanboys trying to distort the past into making it seem as though Apple created the first GUI, when that is CLEARLY not the case. The GUI had its roots at Xerox PARC. That is a FACT.
http://upload.wikimedia.org/wikipedia/en/7/78/Rank_Xerox_8010%2B40_brochure_front.jpg
Actually, you're WRONG!!!! to say he's wrong. You're trying to say that every GUI element was created at Xerox? EVERY one of them? Sorry, but your argument here is akin to something Fox News would air.
DStaal
Jul 20, 09:10 AM
Where you are going to see the difference is when you multi-task.
For example: burn a Blu-ray disc, render a Final Cut Pro movie, download your digital camera RAW files into Adobe Lightroom and run a batch, and watch your favorite movie from the iTunes Movie Store, all without a single hiccup.
Bingo. Check how many processes are running on your computer right now, and you'll see why more cores can help. Writing a program to use multiple CPUs is complicated, yes, but OS X is already written to spread programs across multiple CPUs automatically.
It will take a while for people to come up with effective uses for that, but given the power we will find it.
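To make that concrete, here is a minimal sketch (purely illustrative; the task names are stand-ins for the workloads mentioned above) of how independent processes let the OS spread work across cores even when no single program is multithreaded:

```python
# A rough sketch of OS-level parallelism: each Process below is a
# separate program, and the scheduler is free to place each one on
# its own core. The task names are illustrative stand-ins.
from multiprocessing import Process
import os

def task(name: str) -> None:
    # Stand-in for real work (burning, rendering, batch processing...).
    print(f"{name} running in pid {os.getpid()}")
    sum(i * i for i in range(10_000_000))

if __name__ == "__main__":
    names = ("burn-disc", "render-movie", "raw-batch", "watch-movie")
    jobs = [Process(target=task, args=(n,)) for n in names]
    for job in jobs:
        job.start()
    for job in jobs:
        job.join()  # wait for all four to finish
```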
Leoff
Sep 19, 07:28 AM
Sorry, but I've heard this so many times it gets pretty annoying. Don't assume to know what people want to use their MacBooks for. I want to use mine for music production, which can be very intensive on the processor; other people for graphics, etc., where a few seconds shaved off processing times, added up many times over, can make quite a difference to productivity.
Also, when the new chips come out it will instantly knock a chunk off the resell value - yes this is always the way with technology but buying when an update is coming soon seems silly.
It gets annoying. Why? Because it's true and most people don't want to admit it.
In a few cases here and there, the extra processor power/speed is going to help. But the majority of people buying a MacBook are not going to be burning homemade DVDs, doing intense music composition, or using it for hard-core gaming. They're going to SURF and WRITE.
As for the "resale" value, again, most people who are buying a used MacBook are NOT going to ask "is it a Merom?" They're going to ask how nice the case is, how much use it's gotten, and how much it is, and that's it.
Everybody likes to play "ooo, I'm the hard-core computing whiz and I need the BEST out there", but I bet you if you took an honest poll of everyone who's answered this thread, you'd find at least 75% of these Apple fans have no need for the extra speed; they just want it because it's "cool" and "fast" and it's the latest thing out there.
Lord Blackadder
Mar 23, 12:50 AM
I initially supported the Iraq invasion. I believed the Bush Administration's case for WMDs; in particular, I was swayed by Colin Powell's presentation to the UN. I believed then, as I do now, that Saddam Hussein's government was arbitrary, cruel, and corrupt.
Looking back, it should have been obvious to me that there were a huge number of potential pitfalls - lack of support from Iraqis (and to a lesser extent the international community through the UN) being the most critical. While the initial invasion was predictably successful, the entire issue of post-Saddam Iraq had been poorly thought out - to the extent that it was thought out at all. The result is a tragic disaster of truly epic proportions.
Still, even with this tragedy fresh in our minds (and indeed ongoing, along with the war in Afghanistan), I find it impossible to look at the Libyan situation and say "we should not intervene". There is much I do not like about how my country behaves on the international stage, but in this affair I feel that non-intervention is unconscionable.
Trekkie
Sep 13, 05:42 PM
According to the AnandTech article, it's likely that the Clovertown family will be clocked slower than the Woodcrests.
Clock speed isn't everything. Workload dependent, of course.
milo
Jul 27, 03:39 PM
It's always a little alarming when a post starts "sorry if I missed it but..."
This is a positively thoughtless remark. No one's cheering the MHz myth on; in fact, Intel itself has abandoned the concept. Until the 3 GHz Woodcrests get dropped in a Mac Pro, the 2.7 GHz G5 will still be the fastest chip ever put in a Macintosh. I have a dual-core Pentium D in a bastard Mac at the house; it runs at 3.8 GHz. I'm pretty sure that even it is slower in a lot of areas than these Core 2s. So no, you're absolutely wrong, the MHz myth is all but dead.
The 2.7 G5 will be the highest-clocked chip in a Mac for a while, but probably not the fastest. In a number of benchmarks, Yonah has already beaten dual G5s; the Conroes and Woodcrests will likely widen the gap even more.
Chundles
Jul 20, 11:46 AM
Sorry, I don't see that happening... Apple has basically always given developers a few months (to several months) of lead time with the next major version of Mac OS X. That hasn't taken place yet... so I don't see it being released at WWDC 2006.
He was referring to my post in which I was referring to MWSF '07, not the WWDC.
I still don't think we'll see a full release at MWSF, but I think the date will be announced.
NJRonbo
Jun 14, 11:40 AM
BTW... quick question: how does Radio Shack know what your upgrade price will be? I mean, I know already I am not eligible for a discount and will have to pay $399 or $499. Does Radio Shack have access to your AT&T account to determine your upgrade price?
Blue Velvet
Apr 27, 02:43 PM
Are you calling me a liar? I literally went to WhiteHouse.gov, opened the file in Illustrator, and moved the text around myself. :rolleyes:
You said you opened the file in InDesign, which is what sparked my interest, because that's something you can't technically do. We've already established long ago that you're untrustworthy, so it's fair to be suspicious.
Some things never change. Laughably biased.
You're so cute when you're whining. :)
are there any graphic designers here who can help?
Sure there are. I've been designing since before you were born. This file does not have layers. It has objects within one group. A document created in 1961 will have been scanned, possibly inadvertently split into sections, as it's not even a linked group or a compound path. MattSepata is correct to some extent, but I doubt it's been OCRed. Just a crappily made PDF... which hasn't even been security-locked.
Nice try, but no cookie, Sherlock.
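For anyone who wants to check a claim like this themselves, here is a minimal sketch, assuming the third-party pikepdf library and a hypothetical local file name. It counts the raster images on each page, since a scanned document typically shows up as one big image per page rather than text and vector objects:

```python
# Sketch assuming the third-party pikepdf library and a hypothetical
# file name. A scanned document usually has one large raster image
# per page; a natively generated PDF has text/vector objects instead.
import pikepdf

with pikepdf.open("scanned_document.pdf") as pdf:
    for number, page in enumerate(pdf.pages, start=1):
        # page.images maps XObject names to the raster images on the page
        print(f"Page {number}: {len(page.images)} raster image(s)")
```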
Michael383
Apr 8, 04:11 AM
Many Best Buys with Apple Shops have Apple representatives who work right at the store; I doubt they would let this happen at their store. I wonder how many Best Buys have done this.
The Best Buy where I bought my MBP had an Apple Shop with a great representative in it. Dan was great and could not have been more helpful. I hope the first time I visit an Apple Store I have a similar experience.
unlinked
Apr 6, 04:51 PM
BTW... the Xoom at the Best Buy here is broken... been that way for two weeks now according to the sales guy.
If the sales are so bad why don't they just replace it from the stock they have?
macduke
Mar 25, 10:51 PM
So is there real resolution independence or just a 2x mode?
This. Until this happens displays won't advance any further for actual computers (non-tablet) because there are so many form factors.
Apple can spend the time to make graphics for each flavor of iPhone or iPad because there aren't that many to deal with. It becomes a lot more difficult to do this across a large range of products. Besides, computers are getting to the point where they are too powerful for most users (hence the popularity of the iPad). A retina display option would give people more incentive to upgrade their desktops, laptops, etc. I think?
As a designer, I'd love a Retina 27" ACD. 300 dpi right on my screen, almost perfect. Now if we could just get the color/brightness a little more accurate...
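To illustrate the difference the question above is asking about, here is a minimal sketch (the fractional 1.5x factor is a made-up example; Apple's shipping modes were integer scales): an integer "2x mode" just doubles every point, while true resolution independence has to handle arbitrary scale factors:

```python
# Illustrative only: the 1.5x fractional scale is a made-up example.
# An integer "2x mode" maps every point to a whole number of pixels,
# so existing layouts stay crisp with doubled artwork; arbitrary
# (fractional) scales are what full resolution independence requires.
def points_to_pixels(points: float, scale: float) -> float:
    """Map logical UI points to physical device pixels."""
    return points * scale

for scale in (1.0, 2.0, 1.5):
    pixels = points_to_pixels(44.0, scale)  # 44 pt: a standard control height
    kind = "integer scale (simple @2x assets)" if scale.is_integer() \
        else "fractional scale (needs true resolution independence)"
    print(f"{scale}x: 44 pt -> {pixels} px, {kind}")
```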