roadbloc
Apr 27, 08:49 AM
Oh well. It's not as if I get out much anyway...
http://i.imgur.com/SFDTG.jpg
LagunaSol
Apr 19, 10:43 PM
For that matter, people say that Apple ripped off their bookshelf from Delicious Library. Which itself took it from who knows where.
Probably from an actual bookshelf. ;)
I totally forgot about that! What a joke. Apple has become the king of hypocrites. And they copied the Apple logo from the Beatles' Apple Records.
Have you actually seen the Apple Records logo? Apparently not.
CavemanUK
Aug 6, 05:16 PM
So, you're comparing a mature product (Tiger) to one that's still in beta and which by all accounts has plenty of outstanding issues before it's ever released (Vista)?
Not the fairest of comparisons, is it? Perhaps we should compare the latest of the Leopard builds with the latest Vista build for a more valid comparison of the relative position of the two OSs?
"Beige, boring box". Have you seen some of the hideous case designs that PC companies come out with? Not beige and far from boring (in a bad way). Apple's industrial design and grasp of asthetics and ergonomics is light years ahead.
Its perfectly valid to compare Tiger to Vista. especially since vista (or longhorn) was announced way before tiger was even previewed. If we want to compare the final vista product with a product thats on a similar timeline we would probably have to wait till 10.6 ;)
DocNo
Apr 11, 10:06 AM
I still think tape cameras are the best in quality, but the practicality of recording on a card or a hard drive will soon beat that.
I think Apple's timing with tomorrow is perfect for them to capitalize on this. If you watched the first two clips, the panelists talked about the lack of real standards for data and, more importantly, metadata in file-based workflows. They also mentioned that the only factory in the world producing the most commonly used tape in pro workflows was wiped out by the tsunami in Japan. If Apple follows up with a new standard for file-based workflow (which I fully expect them to do - skating to where the puck will be - it's a no-brainer), and with Thunderbolt and a few manufacturers ready to capitalize on it, I think you could see a dramatic shift in workflow, since the tape situation will get dire for many. As one of the panelists pointed out, people aren't going to stop creating content just because they can't get more tape.
This might be the external catalyst that causes a dramatic shift. They are rare, but they do happen and events certainly seem to be lining up!
(I can't wait for the eventual conspiracy theorists who will no doubt claim SJ engineered the tsunami in order to take advantage of it :rolleyes: )
ccrandall77
Aug 11, 01:59 PM
As I said before, GSM has 81% of the market. UMTS (W-CDMA) enables handover back and forth between UMTS and GSM. CDMA2000 cannot hand over between GSM and CDMA2000. (See Wikipedia (http://en.wikipedia.org/wiki/W-CDMA): "The CDMA family of standards (including cdmaOne and CDMA2000) are not compatible with the W-CDMA family of standards that are based on ITU standards.")
Hence all networks that have GSM will transition to UMTS, since this decreases their initial investment as they move from 2/2.5G to 3G. Changing network standard is expensive, but GSM/EDGE market share has been growing in the US and will most likely continue to grow. At the same time, CDMA is non-existent in Europe.
The conclusion is simple - CDMA2000 is, in the long run, as dead as Betamax.
If the long run is 10 years, I'll grant you that. But in the US and much of Asia (Australia, maybe), where there are CDMA carriers, CDMA2000 1x-EVDx is going to be around for a while.
Actually, WCDMA also inherits much of its tech from CDMA/IS-95, and I have seen some documentation showing that WCDMA can be made compatible with CDMA2000, just like UMTS/WCDMA is compatible with GSM. But it sounds as if the upgrade path from GSM/GPRS/EDGE to WCDMA is easier than going from CDMA2000 1x to WCDMA.
But for the next several years, CDMA2000 1x-EVDO will be better than the GSM-related technologies. And by the time WCDMA takes over, the iPhone will be as antiquated as the Newton.
Apple needs to create both versions, as CDMA has about 5x% of the US market... and Apple has catered, and probably will continue to cater, to the US market first.
notabadname
Mar 31, 06:35 PM
What a concept. Apple should consider it, for a more consistent and stable OS . . .
oh, they do
alex2792
Mar 25, 11:41 PM
Apple better integrate the AirDrop functionality into iOS 5 as well.
Popeye206
Apr 25, 02:06 PM
UGH! That didn't take long before the sharks swarmed!
How ridiculous. :rolleyes:
nagromme
Aug 7, 04:16 PM
Will Time Machine mean that you can't permanently delete any file? What about something confidential which you want to "e-shred"?
Never fear: it says you can exclude any data from Time Machine that you wish.
(Plus, if you change your mind about a file, you have OS X's Secure Empty Trash--which might also purge your backup, if it's connected. But whatever the implementation, I'm sure Apple has thought about this--we just don't have 100% of the details yet.)
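(As a concrete aside: on shipping versions of Mac OS X, CoreServices exposes a per-item Time Machine exclusion flag, so an app can keep a confidential file out of the backup programmatically. A minimal sketch, written in Swift against the modern toolchain; the file path is purely hypothetical:)

```swift
import CoreServices
import Foundation

// Mark a single file as excluded from Time Machine backups.
// CSBackupSetItemExcluded is the real CoreServices call; the path is made up.
let secret = URL(fileURLWithPath: "/Users/me/confidential.txt")
let status = CSBackupSetItemExcluded(secret as CFURL,
                                     true,   // exclude the item
                                     false)  // false: exclusion follows the item, not the path
if status != noErr {
    print("Failed to set Time Machine exclusion: \(status)")
}
```

Passing false for the exclusion flag later would make the item backed up again; this is per-item, on top of whatever folder-level exclusions the user sets in the Time Machine preferences.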
My main concern overall about Leopard is that feature creep is going to cut into ease of use.
Only if you are FORCED to use Spaces. But like Exposé, Dashboard, and even the right mouse button, it will be optional. Apple is sensitive about beginner simplicity.
SiliconAddict
Aug 6, 03:27 AM
This kind of thinking is truly lame; just buy a Dell and go for penis enlargement surgery with the money you saved. No one will know the difference.
Not lame. Childish. I mean seriously. Is your (Generic your.) MBP any slower the day after they announce Core 2 MBPs? I swear to god it's almost as if people's lives are so incomplete that they need to feel special by having the top of the dog pile hardware. I received my MBP on Feb 21st at 10:30AM. Apple can do whatever they want. I'll still be enjoying my Mac at the same level I did on the 21st.
Arcus
Apr 25, 04:26 PM
This is so incredibly stupid, it's mind-numbing. All because a couple of whistle-blowers decided to point out the obvious, to detract from Apple's quarterly sales and earnings announcement. Anyway, the lawsuit is completely flawed. I'm all for privacy, I love privacy. I'm an iOS developer and I know about the location tracking in iOS. Not that big of a deal; in fact, if you answer "no" to the prompts when the phone asks if it's OK to use your current location, then nothing is sampled, tracked or stored.
Luckily I got your post before you deleted it. On the:
Not that big of deal, in fact, if you answer "no" to the prompts when the phone asks if it's OK to use your current location, then nothing is sampled, tracked or stored.
That is so wrong I doubt you are even a developer.
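For what it's worth, here is a minimal Swift sketch of the consent gate being argued about, written against the modern CoreLocation API (which postdates this thread). Only the CoreLocation calls are real API; the LocationSampler class and its method names are illustrative, and this per-app prompt is a separate mechanism from the system-level location cache the lawsuit concerned.

```swift
import CoreLocation

// Illustrative sketch of an app that respects the location consent prompt.
final class LocationSampler: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
    }

    func startIfAllowed() {
        switch manager.authorizationStatus {
        case .authorizedWhenInUse, .authorizedAlways:
            manager.startUpdatingLocation()          // user said yes: start sampling
        case .notDetermined:
            manager.requestWhenInUseAuthorization()  // shows the consent prompt
        default:
            break                                    // user said no: sample nothing
        }
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        // Location fixes arrive here only after the user granted authorization.
    }
}
```

Calling startIfAllowed() before the user has answered simply raises the prompt; no fix reaches the delegate until the user opts in.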
bushido
Mar 26, 09:29 AM
Zooming on Safari is pretty nice too, not as nice as the iPad's scrolling, but still nice.
I HATE the new zooming; it drives me nuts and works "best" using a finger from each hand.
And I never understood Spaces, can't figure out what it does lol
bagelche
Apr 5, 09:36 PM
Heh. looks like foidulus had a similar idea. I missed that post. And MattInOz comes in with a reasonable rebuttal and more technical knowledge than I have.
I don't think either foidulus or I were saying they were completely siloed--I'm sure they had some level of access to the A/V code. The question is whether it's in SL. Possibly.
iEvolution
Apr 19, 06:51 PM
So when is apple going to sue over the letter "i"?
Or how about suing companies for using certain shapes?
This kind of garbage just makes them look petty, just like the YouTube videos demonstrating other phones' antenna problems.
shamino
Jul 21, 10:07 AM
With all these new technologies with 4, 8 and eventually 24-core capacities (some time in the not too distant future) all running at 64-bit, we mustn't forget that software also has to be developed for these machines in order to get the most out of the hardware. At the moment we aren't even maximising the Core Duo, let alone a quad core and all the rest!!!!
It really depends on your application.
On the desktop, if you're a typical user that's just interested in web surfing, playing music files, organizing your photo collection, etc., more than two cores will probably not be too useful. For these kinds of users, even two cores may be overkill, but two are useful for keeping a responsive UI when an application starts hogging all the CPU time.
If you start using higher-power applications (like video work - iMovie/iDVD, for instance) then more cores will speed up that kind of work (assuming the app is properly multithreaded, of course.) 4-core systems will definitely benefit this kind of user.
With current applications, however, I don't think more than 4 cores will be useful. The kind of work that will make 8 cores useful is the kind that requires expensive professional software - which most people don't use.
If you get away from the desktop and look to the server market, however, the picture changes. A web server may only be running one copy of Apache, but it may create a thread for every simultaneous connection. If you have 8 cores, then you can handle 8 times as many connections as a 1-core system can (assuming sufficient memory and I/O bandwidth, of course.) Ditto for database, transaction, and all kinds of other servers. More cores means more simultaneous connections without performance degradation.
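To make that thread-per-connection point concrete, here's a minimal sketch in Swift (a toy echo server, not Apache; the port, backlog, and buffer size are arbitrary, and error handling is mostly elided). Each accepted client runs on its own thread, so with N cores roughly N connections can make progress at once.

```swift
import Foundation

// Toy echo server in the thread-per-connection style described above.
let listener = socket(AF_INET, SOCK_STREAM, 0)
precondition(listener >= 0, "socket() failed")

var addr = sockaddr_in()
addr.sin_family = sa_family_t(AF_INET)
addr.sin_port = in_port_t(8080).bigEndian   // arbitrary port
addr.sin_addr = in_addr(s_addr: 0)          // 0 == INADDR_ANY

let bound = withUnsafePointer(to: &addr) {
    $0.withMemoryRebound(to: sockaddr.self, capacity: 1) {
        bind(listener, $0, socklen_t(MemoryLayout<sockaddr_in>.size))
    }
}
precondition(bound == 0, "bind() failed")
listen(listener, 16)

while true {
    let client = accept(listener, nil, nil)   // blocks until a client connects
    guard client >= 0 else { continue }
    Thread.detachNewThread {                  // one thread per connection
        var buf = [UInt8](repeating: 0, count: 1024)
        while true {
            let n = read(client, &buf, buf.count)
            guard n > 0 else { break }        // 0 = client closed, <0 = error
            write(client, buf, n)             // echo the bytes back
        }
        close(client)
    }
}
```

A real server would cap and reuse its threads rather than spawning one per client forever, but the sketch shows why core count maps so directly onto simultaneous connections.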
Cluster computing has similar benefits. With 8 cores in each processor, it is almost as good as having 8 times as many computers in the cluster, and a lot less expensive. This concept will scale up as the number of cores increases, assuming motherboards can be designed with enough memory and FSB bandwidth to keep them all busy.
I think we might see a single quad-core chip in consumer systems, like the iMac. I think it is likely that we'll see them in Pro systems, like the Mac Pro (including a high-end model with two quad-core chips.)
I think processors with more than 4 cores will never be seen outside of servers - Xserves and maybe some configurations of Mac Pro. Mostly because that's where there is a need for this kind of power.
Zadillo
Aug 7, 09:34 PM
Safari appears to be brushed metal. Go here (http://www.apple.com/macosx/leopard/dashboard.html) and go to about 1/6 of the way through.
Perhaps sometime between now and Spring 2007 they might find the time to change that.
SevenInchScrew
Dec 8, 12:54 PM
^^^ Again, from Sony and referenced in my post 152 (http://forums.macrumors.com/showpost.php?p=11513752&postcount=152)
I'm not arguing, just pointing out what Sony themselves have to say on the subject. Of course, as you progress further into the game, you are going to use more Premium racing models.
I understand that Sony is saying damage is not "unlocked" at some point in the game. And that is correct. When you start the game, there is damage. You can crash head-on into a wall at 120 in a Premium car, and your bumper will crumple a little. But do this same thing, in the same Premium car, at level 20 and 40, and you will see increasing levels of damage.
So yeah, Sony is right that damage itself isn't "unlocked" at some point, but the higher degrees of damage ARE. And not just from using race cars; Premium cars as well. Just like how the AI gets slightly less brain-dead as you progress, the cars somehow start to take more damage.
Burnsey
Mar 19, 12:59 PM
When will you people realize that Obama is not in charge? You're not in charge either. Corporate interest rules the USA, Libya has 2% of the world's oil supply and a lot of companies have interests there. No one intervened militarily in Rwanda or East Timor. You guys can continue to have your little left vs right, conservative vs. liberal distraction of a debate, meanwhile the real people running the show don't give a rat's ass about any of it.
jwhitnah
Aug 8, 12:34 AM
anyone else a little underwhelmed with today's WWDC? There isn't anything that really jumped out at me besides the Mac Pro.
Mac Pro looks very nice. Now I am sure they will update their LCDs, so I do not want one/two, and Leopard is a very modest refinement. They should have had system restore like MS years ago. Not a compelling upgrade, but I will buy it. Sigh.
faroZ06
Apr 8, 12:34 AM
I am confused about this. Did Best Buy get iPads but tell customers that they don't have them? So now Apple pulled the iPads from the shelves, but there weren't any on the shelves...
Why would they do that :confused:?
AppleKrate
Sep 19, 05:29 AM
Why do you even visit this site? You are doing nothing but criticising Apple and their products. Please leave.
Ps. If I was Admin I would ban you :p
You guys crack me UP! Peace and love, they're only machines (ah, but what machines...) :)
ncook06
Sep 13, 09:50 AM
I'm just wondering if I can drop one of these into an iMac... Are they pin-compatible? Also sort of wondering about a heat issue.
magbarn
Apr 9, 09:23 AM
Intel did indeed force Apple to use their IGP by not licensing other vendors to provide IGPs. The reason the 13" MBP and 13" MBA use IGPs and not a dedicated GPU is one of space. Apple can't magically conjure up space on the logic board.
I push the GPU more often than I push the CPU on my MBA. I doubt I'm in the minority, though I'm probably part of the minority that actually knows this little fact. ;)
No matter how much you try to spin this, Intel got greedy on this one and couldn't back their greed with competence. They have sucked at GPUs since they entered the GPU game (Intel i740, anyone?).
I don't think 2IS is getting that IF Intel had allowed Nvidia to continue making Sandy Bridge chipsets, Nvidia could've easily integrated a 320M successor into the south bridge. This would give you the best of both worlds: the downclocked low-voltage Intel HD graphics when on battery or doing basic surfing, or the 320M successor in the south bridge when playing games or editing photos in Aperture. All this WITHOUT raising the motherboard chip count that a separate discrete GPU (on its own, not integrated into the chipset like the 320M) would entail.
Thunderhawks
Apr 6, 02:25 PM
Motorola not selling any units of a crappy product? Huh... who'd have thought.
No need to brag, IMO, and did you really try a Xoom and put it through its paces?
I didn't, but I tried an iPad 1 and it wasn't doing all I would want it for; plus, I never buy a first-gen Apple product. (That little rule has served me well since 1984.)
Apple is waaayyyyyyy ahead at the moment and the copy cats are playing catch up.
But, I like that there will be a race forcing each manufacturer to make the product better and better.
While Apple is not ignoring what the competition does, their philosophy of making their own products better and better seems to be successful.
So, why change that formula or shake in your boots, just because somebody launches a new copy?