Huntn
Apr 23, 10:17 PM
It would still provide evidence for the individual concerned, right? It may have no bearing on the reality of our existence, but our existence doesn't matter. It's their existence that matters. Faith, true faith, involves a lot of introspection.
There's concrete reality and abstract reality, the world of the Forms if you like. It's in abstract reality that physical principles are proven, yet we couldn't see or feel them otherwise in the concrete world.
Thus, if the person has an epiphany, and then reflects on what just occurred logically, it could still be called proof.
When I think of 'proof' I think of something that meets a logical standard for a large group of people. Individual proof that no one else sees is questionable, more suited to being called faith. By your reasoning a Theist and an Atheist could both claim proof based on what they imagine, but they would each claim the other is wrong. In this matter there is no such thing as proof.
On a separate note, even if a giant face appeared in the sky and said "I am God!" how would we prove whether this is a deity or an advanced alien species? I suppose this could be an argument for the individuality of faith, but still it's not what any logical person would call real proof. If it is something you sense, there is no guarantee your senses are accurate. And then what about the person who sees pink dragons? Reality might really be illusive. ;)
Liquorpuki
Oct 7, 06:44 PM
And how does the carrier matter at all in your argument? Sorry, but that entire argument has no meaning in this debate.
You were arguing in your little list that having to jailbreak their iphone is gonna make users want to migrate to Android phones. Jailbreaking is basically hacking and phones are hacked because functionality is crippled. I'm pointing out that Android phones can have the same problem, especially if they come out on carriers such as Verizon, which goes further and also cripples hw features iPhone users take for granted.
The iPhone platform has some significant variations. Location precision (lack of GPS), microphone or speaker existence on the touch, existence of MMS, CPU speed between models, amount of RAM (a potentially big problem for game makers).
The context isn't how many variables exist but how many variables devs have to deal with. iPhone app developers have to deal with much less than developers on decentralized hardware platforms. WM developers have several different OEM's to deal with as well as all their models and generations thereof. If you can't see how the complexity translates into a harder development process, I don't know what to tell you.
Really? Do you have an example of an app bricking a WM phone?
I had a couple apps brick my i730 back when I was on Verizon. I ended up having to hard reset and resync all my contacts.
Verizon doesn't cripple their smartphones. Even their GPS is unlocked now
the folks at the Verizon forums disagree with you
So you admit that it's hobbled in its stock form? ATT / Verizon / Sprint don't block any apps you want to use on their smartphones. Or themes. Or anyt
First, most phones I've seen are hobbled in their stock form, not just the iPhone. But personally I think the quality of the iPhone and all the other things the design engineers got right outweighs the fact I have to jailbreak it to put a 5x5 matrix of icons on my screen out of the box.
I hate AT&T service here in LA and I hate the fact I can't tether but I put up with it because it's such a good phone. I don't care that Android or Sprint doesn't screen apps because to take advantage of that, at this point in time I'd have to downgrade to a shttier phone and go to an app store that has less than 25% of the apps Apple does, and ironically, because they don't screen, more of them suck
The iPhone's Bluetooth was crippled to begin with... and still is. The original iPhone will always lack GPS
Crippled means the hw is functional but was disabled by the carrier or manufacturer. An iPhone that wasn't designed with a GPS chip is not crippled. An iPhone having a fully functional GPS chip that won't work without purchasing Telenav is crippled.
alex_ant
Oct 9, 08:08 PM
Originally posted by gopher
Maybe we have, but nobody has provided compelling evidence to the contrary.
You must be joking. Reference after reference has been provided and you simply break from the thread, only to re-emerge in another thread later. This has happened at least twice now that I can remember.
The Mac hardware is capable of 18 billion floating-point calculations a second. Whether the software takes advantage of it, that's another issue entirely.
My arse is capable of making 8-pound turds, but whether or not I eat enough baked beans to take advantage of that is another issue entirely. In other words,
18 gigaflops = about as likely as an 8-pound turd in my toilet. Possible, yes (under the most severely ridiculous conditions). Real-world, no.
If someone is going to argue that Macs don't have good floating point performance, just look at the specs.
For the - what is this, fifth? - time now: AltiVec is incapable of double precision, and is capable of accelerating only that code which is written specifically to take advantage of it. Which is some of it. Which means any high "gigaflops" performance quotes deserve large asterisks next to them.
If they really want good performance and aren't getting it they need to contact their favorite developer to work with the specs and Apple's developer relations.
Exactly, this is the whole problem - if a developer wants good performance and can't get it, they have to jump through hoops and waste time and money that they shouldn't have to waste.
Apple provides the hardware, it is up to developer companies to utilize the hardware the best way they can. If they can't utilize Apple's hardware to its most efficient mode, then they should find better developers.
Way to encourage Mac development, huh? "Hey guys, come develop for our platform! We've got a 3.5% national desktop market share and a 2% world desktop market share, and we have an uncertain future! We want YOU to spend time and money porting your software to OUR platform, and on top of that, we want YOU to go the extra mile to waste time and money that you shouldn't have to waste just to ensure that your code doesn't run like a dog on our ancient wack-job hack of a processor!"
If you are going to complain that Apple doesn't have good floating point performance, don't use a PC biased spec like Specfp.
"PC biased spec like SPECfp?" Yes, the reason PPC does so poorly in SPEC is because SPECfp is biased towards Intel, AMD, Sun, MIPS, HP/Compaq, and IBM (all of whose chips blow the G4 out of the water, and not only the x86 chips - the workstation and server chips too, literally ALL of them), and Apple's miserable performance is a conspiracy engineered by The Man, right?
Go by actual floating point calculations a second.
Why? FLOPS is as dumb a benchmark as MIPS. That's the reason cross-platform benchmarks exist.
Nobody has shown anything to say that PCs can do more floating point calculations a second. And until someone does I stand by my claim.
An Athlon 1700+ scores about what, 575 in SPECfp2000 (depending on the system)? Results for the 1.25GHz G4 are unavailable (because Apple is ashamed to publish them), but the 1GHz does about 175. Let's be very gracious and assume the new GCC has got the 1.25GHz G4 up to 300. That's STILL terrible. So how about an accurate summary of the G4's floating point performance:
On the whole, poor.*
* Very strong on applications well-suited to AltiVec and optimized to take advantage of it.
techwarrior
Nov 12, 12:14 PM
Add me to the happy list. I have had every iPhone since the 3G and rarely lose a call. One or two places I typically go have poor service, so I let others know I will call back if I drop in those spots. The MCell has done wonders for the poor service at my home.
ATT is the only service I can get at work. Because my office is an R&D facility for a company that makes phone systems, they block all external wireless signals and then put ATT repeaters in the building.
So, for me, it would take a lot to push me over the edge to move to another provider. I do like how others are pushing ATT to adopt more competitive plan options, and I think competition from TMo, Sprint/Nextel and Vz can only be good for those of us who can stay with ATT.
Peterkro
Mar 12, 07:08 PM
Reactor number three at the same plant has cooling and containment issues; hopefully they can get it under control.
NebulaClash
Apr 28, 08:48 AM
The tangible item is the smartphone hardware itself. That's like saying the battle between Sony and Samsung LCD TVs isn't exactly about TVs... it's about Google TV (Sony) vs. Samsung Smart TV.
Then why don't they show studies that compare Samsung versus LG versus Motorola smart phone hardware sales? Why are they constantly talking about the "Android" share?
UnixMac
Oct 9, 05:51 PM
Bottom line... Macs are overpriced... we just keep buying them, so why would the accountants want to change that gig?
UnixMac
Oct 8, 04:38 PM
Sadly, the lack of a system bus faster than 133/167 MHz and of leading-edge RAM technology is a major downside of Mac hardware. A G4 with software optimized for it is still on par with a P4, but when AltiVec or multiprocessor awareness is not in the picture, the Mac slips very far behind. I still have faith that the G5 will make up for this gap.
As for OS X vs Windows 2000, I am not as technically aware as the above poster; however, my own experience in a large office environment with heavy networking is that Windows 2000 has failed us. We are switching to Unix and Sun because we can't afford the downtime that Windows 2000 is giving us, the cost advantage of Windows notwithstanding.
I have not come across many large computer operations people who will tell me that Windows is a replacement for Unix. Not unless they are dealing with a small operation and a limited budget.
Foxglove9
Aug 29, 11:13 AM
Eh, I believe little of what Greenpeace ever says. :rolleyes:
redkamel
Aug 29, 06:57 PM
The point is that I've never heard a satisfactory answer as to why water vapor isn't taken into account when discussing global warming, when it is undeniably the largest factor in the greenhouse effect. ...
Forty years ago, cars released nearly 100 times more CO2 than they do today, industry polluted the atmosphere while being completely unchecked, and deforestation went untamed. Thanks to grassroots movements in the 60s and 70s (and yes, Greenpeace), worldwide pollution has been cut dramatically, and CO2 pollution has been cut even more thanks to the Kyoto Agreement. But global warming continues, despite humanity's dramatically decreased pollution of the atmosphere.
man I just had to post....the nerd in me...
Probably (no sarcasm) because most water vapor is naturally produced and can be recycled as rain, while greenhouse gases usually stay in the atmosphere. CO2 can also be recycled; however, it does not recycle itself the way water vapor does, it requires another source to convert it to organic carbon.
While nature may produce 3x as much CO2 as humans, I do not believe the level of CO2 produced by nature is increasing. Nature also has built-in systems to use the CO2 it makes to capture energy, or to store the CO2 as carbon in fossil fuels or organic matter. Humans only produce CO2 by making energy for themselves to use, and their production is increasing, without a way to draw the CO2 they made back out. Therefore the increase in CO2 that will not be removed is the concern. There are also other chemicals, but CO2 is widely publicized because everyone knows what it is, too.
It's like if you have a storeroom people drop things off in and take things out of, but it happens at pretty much the same rate. Except there is just one guy who only drops stuff off. Eventually all his stuff will take up a noticeable space in the storeroom.
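A toy illustration of that storeroom picture, as a minimal Python sketch (all numbers are hypothetical, chosen only to show the arithmetic, not a climate model): if additions and removals roughly cancel except for one steady net contributor, the total keeps growing no matter how small that contribution looks next to the natural flows.

# Toy model of the "storeroom" analogy above: natural sources and sinks
# roughly cancel, while one extra contributor adds a small net amount each
# year. All numbers are made up for illustration only.
natural_in = 100.0    # units added per year by natural processes
natural_out = 100.0   # units removed per year by natural sinks
extra_in = 3.0        # the one guy who only drops stuff off

stock = 0.0
for year in range(1, 51):
    stock += natural_in + extra_in - natural_out
    if year % 10 == 0:
        print(f"year {year:2d}: accumulated stock = {stock:.0f}")
# The stock rises by 3 units every year, even though the extra contribution
# is tiny compared with the natural flows going in and out.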
Increases in greenhouse gases are not immediately felt. We are now feeling the effects of gases from decades ago. Also, although you say "worldwide pollution has decreased", even though I doubt it is true, you mean our RATE of pollution has decreased, not the total amount of pollution we have put in the air, which is still increasing. When we decrease the amount of net pollution produced by humans, then it is a good sign.
Also, to everyone complaining about our environment being ruined yet wanting GM crops to grow food to stop starvation... (disclaimer: I am not cold hearted, I am realistic). The problem we have on this planet, as many agree, is too much pollution. Pollution is caused by people. So if we have more people, we will have more pollution. More people = more pollution.
When a system's carrying capacity is reached, the population level declines until resources can recover, then it climbs again. But if you artificially raise the carrying capacity (as humans like to do), then the crash will be bigger....and the resources may not survive as they are deprived of the humans that run, control, and supply them.
Believe it or not, our planet was not designed to sustain 8 billion people. Finding ways to produce food efficiently is great...but it should be used for less resources= same amount of food, NOT same resources=more food. It IS too bad people have to starve. But using that efficiency to make more food for more people will only lead to more people wanting more food, and goods. Eventually it will not be able to be supplied...for some reason or other. And you will have a very, very large crash.
Thought experiment: you put a bunch of fish in a small fish tank. Keep feeding them... they reproduce. Clean the water... feed them all, they reproduce. Eventually they produce waste faster than you can clean it, or you forget to clean one day... and they all die.
KnightWRX
May 2, 05:51 PM
Until Vista and Win 7, it was effectively impossible to run a Windows NT system as anything but Administrator. To the point that other than locked-down corporate sites where an IT Professional was required to install the Corporate Approved version of any software you need to do your job, I never knew anyone running XP (or 2k, or for that matter NT 3.x) who in a day-to-day fashion used a Standard user account.
Of course, I don't know of any Linux distribution that doesn't require root to install system wide software either. Kind of negates your point there...
In contrast, an "Administrator" account on OS X was in reality a limited user account, just with some system-level privileges like being able to install apps that other people could run. A "Standard" user account was far more usable on OS X than the equivalent on Windows, because "Standard" users could install software into their user sandbox, etc. Still, most people I know run OS X as Administrator.
You could do the same as far back as Windows NT 3.1 in 1993. The fact that most software vendors wrote their applications for the non-secure DOS based versions of Windows is moot, that is not a problem of the OS's security model, it is a problem of the Application. This is not "Unix security" being better, it's "Software vendors for Windows" being dumber.
It's no different than if, instead of writing my preferences to $HOME/.myapp/, I wrote software that required writing everything to /usr/share/myapp/username/. That would require root in any decent Unix installation, or it would require me to set permissions on that folder to 775 and make all users of myapp part of the owning group. Or I could just go the lazy route, make the binary 4755 and set mount opts to suid on the filesystem where this binary resides... (ugh...).
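A minimal Python sketch of that contrast (the application name and both paths are hypothetical, purely to illustrate the point): writing preferences under the user's home directory needs no special rights, while writing under a system directory fails for a regular user unless an admin has loosened the permissions as described above.

# Illustrative sketch only: why per-user preference paths avoid the
# root/Administrator requirement that system-wide paths impose.
# "myapp" and both paths are hypothetical, not a real application.
import os

def save_prefs_per_user(data):
    # Per-user location: writable by the owning user, no root needed.
    prefs_dir = os.path.expanduser("~/.myapp")
    os.makedirs(prefs_dir, exist_ok=True)
    path = os.path.join(prefs_dir, "prefs.conf")
    with open(path, "w") as f:
        f.write(data)
    return path

def save_prefs_system_wide(data, username):
    # System-wide location: normally requires root, or an admin who
    # pre-created the directory with group-write permissions (e.g. 775).
    prefs_dir = os.path.join("/usr/share/myapp", username)
    try:
        os.makedirs(prefs_dir, exist_ok=True)
        path = os.path.join(prefs_dir, "prefs.conf")
        with open(path, "w") as f:
            f.write(data)
        return path
    except PermissionError as exc:
        # What a regular user hits when a vendor goes the lazy route and
        # assumes it can write under /usr/share.
        return "refused: %s" % exc

print(save_prefs_per_user("theme=dark"))              # works as any user
print(save_prefs_system_wide("theme=dark", "alice"))  # refused unless root or 775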
This is no different on Windows NT based architectures. If you were so inclined, with tools like Filemon and Regmon, you could granularly set permissions in a way that let you install this misbehaving software so that it would work for regular users.
I know I did many times in a past life (back when I was sort of forced to do Windows systems administration... ugh... Windows NT 4.0 Terminal Server edition... what a wreck...).
Let's face it, Windows NT and Unix systems have very similar security models (in fact, Windows NT has superior ACL support out of the box, akin to Novell's close-to-perfect ACLs; Unix is far more limited with its read/write/execute permission scheme, even with POSIX ACLs in place). It's the hoops that software vendors outside the control of Microsoft made you go through that forced lazy users to run as Administrator all the time and gave Microsoft such headaches.
As far back as I remember (when I did some Windows systems programming), Microsoft was already advising to use the user's home folder/the user's registry hive for preferences and to never write to system locations.
The real difference, though, is that an NT Administrator was really equivalent to the Unix root account. An OS X Administrator was a Unix non-root user with 'admin' group access. You could not start up the UI as the 'root' user (and the 'root' account was disabled by default).
Actually, the Administrator account (much less a standard user in the Administrators group) is not a root level account at all.
Notice how a root account on Unix can do everything, just by virtue of its 0 uid. It can write/delete/read files from filesystems it does not even have permissions on. It can kill any system process, no matter the owner.
Administrator on Windows NT is far more limited. Don't ever break your ACLs or don't try to kill processes owned by "System". SysInternals provided tools that let you do it, but Microsoft did not.
All that having been said, UAC has really evened the bar for Windows Vista and 7 (moreso in 7 after the usability tweaks Microsoft put in to stop people from disabling it). I see no functional security difference between the OS X authorization scheme and the Windows UAC scheme.
UAC is simply a gui front-end to the runas command. Heck, shift-right-click already had the "Run As" option. It's a glorified sudo. It uses RDP (since Vista, user sessions are really local RDP sessions) to prevent being able to "fake it", by showing up on the "console" session while the user's display resides on a RDP session.
There, you did it, you made me go on a defensive rant for Microsoft. I hate you now.
My response: why bother worrying about this when the attacker can do the same thing via shellcode generated in the background by exploiting a running process, so that the user is unaware that code is being executed on the system?
Because this required no particular exploit or vulnerability. A simple Javascript auto-download and Safari auto-opening an archive and running code.
Why bother? You're not "getting it". The only reason the user is aware of MACDefender is because it runs a GUI based installer. If the executable had had 0 GUI code and just run stuff in the background, you would have never known until you couldn't find your files or some Chinese guy was buying goods with your CC info, fished right out of your "Bank stuff.xls" file.
That's the thing, infecting a computer at the system level is fine if you want to build a DoS botnet or something (and even then, you don't really need privilege escalation for that, just set login items for the current user and run off a non-privileged port; root privileges are not required for ICMP access, only raw sockets).
These days, malware authors and users are much more interested in your data than your system. That's where the money is. Identity theft, phishing, they mean big bucks.
puma1552
Mar 12, 05:19 AM
Also, FTR, the 60 km radius is old news on Japanese TV, and telling us they are detecting cesium and outright saying that it may indicate a meltdown doesn't sound like covering things up to me.
The DRis
Mar 18, 12:16 PM
http://modmyi.com/forums/iphone-news/755094-t-cracking-down-mywi-tethering.html
They're bluffing and hoping to get those high data users off of their unlimited data plans by having them forget to call in and opt out. So just stay on your toes.
Dirty Mother*Bleeping* bandits.
Eff em all. Use the data.
torbjoern
Apr 24, 11:12 AM
The deal with religious people is founded in human nature; it's the need to have faith in something bigger than oneself. For some reason, the Church of Scientology comes to my mind when I'm writing this. Oh yes, here is my question: how many religions are founded on somebody's desire to exploit that need?
Lately I read that the iPhone was considered the world's greatest invention. It isn't. God is the greatest invention ever.
inkswamp
Oct 26, 03:49 AM
If history serves as a template for the future
Honestly, with Apple, history doesn't serve as much of a template for the future when you think about it.
skunk
Apr 24, 10:50 AM
I'm just entertaining the notion of agnosticism as a kind of nod to the great debt we owe Judaism and Christianity. If it wasn't for those two faiths which allowed for reformations (such a thing would be impossible under, say, Islam) then secular Western democracies would be vastly different.
What do you mean by "allowed for"? Do you mean that they could have slaughtered more people in the wars of religion? As for Islam, we probably would not have had a Renaissance without Islam.
If Europe had succumbed to the advance of Islam, if Vienna had fallen in the 17th century things likely would be very different today. Europe would have produced as many Nobel Prize winners as the entire Islamic World
We would all be speaking German, I expect.
ATD
Sep 26, 04:33 PM
This coming year is going to be great. A Mac Pro with 8 cores along with UB versions of the software packages I use daily. What more could a peep like me ask for... Well, Pixar could offer multi-threading support for the Renderman Maya plug-in, that would be nice. :o
Good things come to those who wait. :)
<]=)
I didn't know the Renderman Maya plug-in was not multi-threaded. I was thinking of getting it; are you saying it's only a one-CPU renderer?
BoyBach
Aug 30, 07:56 AM
I'm an officer (imagery information analyst) for the defence force. In my line of work I get this inanely useless "hippy crap" 24 hours a day, 7 days a week.
The army is full of hippies? :eek: :D
Or are you spying on hippy communes? <shifty eyes>
:D
york2600
Aug 29, 02:59 PM
If you head over to Apple's environmental page and read through it (which I have done several times) you'll see that much of what they claim to be doing for the environment is actually more along the lines of what is called natural capitalism. That's not to say it's bad, but don't let them fool you into thinking they have the environment's best interests at heart. They're looking out for the bottom line. They make claims about LCDs, but manufacturing energy and toxic inputs on LCDs vs CRTs is a pretty poor argument (read the LCD vs CRT report by the EPA to see exact figures). Apple can claim a lot of environmental victories, but many of them are simple side benefits of the movement in their product line. LCDs use less energy and have lower cooling costs in lab environments. Core Duos take less energy than G5s. These are true, but Apple didn't switch to save the world.
Dell has come under a lot of pressure recently for their poor environmental track record, from their lack of a takeback program to their recycling of components using prison labor. They've been forced to clean up their act. They have a pretty amazing takeback program. Apple has a really horrible one. I've used both. Apple needs to step up here. They have a program that seems to exist simply so they can say it's there. Apple has also pulled products from the European market instead of redesigning them to meet new toxics standards. Dell switched suppliers and kept their products worldwide. Greenpeace should be targeting Apple here. I hope Apple reacts. Good-quality products with a long lifespan and a low environmental impact benefit everyone.
Multimedia
Nov 2, 09:10 PM
That's the Kentsfield chip, not the Clovertown (Xeon) CPU, but the benchmarks are interesting.
Just as expected, the quad cores are only going to be a big improvement for the software that can utilize them. Software will catch up with multicores, hopefully by Q2 07 when I'll be buying a new machine.
A significant amount of multimedia-related software already will use more than two cores and can be run simultaneously to easily hose an 8-core Mac Pro now.
AlBDamned
Aug 29, 11:24 AM
danielwsmithee is right.
At work, we never throw out a Mac. But the PC boxes get replaced often.
digitalbiker
Sep 12, 04:55 PM
This is the device I've been waiting for 2+ years for Apple to come out with. Those who think this isn't a Tivo killer don't understand Tivo's plans. This hasn't just killed the current Tivo, this has killed the gen4 Tivo that isn't even out yet. It's stolen its thunder by at least a year if not much more.
It's been obvious for awhile now that Tivo has been moving in their slow ponderous way towards a method of content delivery over internet. They have been doing it for ads for years now, and they want to do it with content so bad they can taste it. They hired a key guy from bittorrent several years ago, but haven't done anything impressive since. They want it, but with it taking them 3 years to go with cable card and dual tuner, they just aren't able to get their act together in time.
Apple has played their cards exactly right. They've done what Tivo, Netflix, Microsoft, Sony, and Blockbuster would all give their collective left nut to do. They've done what every local cable company and even every media mogul SHOULD have been laying awake worrying about, which is to have made them irrelevant in one fell swoop. Not to every single consumer by a long shot, but to a significant demographic of tech-savvy consumers who know what they want and will shift paradigms to get it.
As much as I want this right this very second, waiting for 802.11n is the right thing to do and I'm glad Apple did it. I don't have a TV, but I'll buy a 20" monitor and one of these the day it comes out. I'll buy a second one and a projector as soon as possible afterwards.
This is going to be a much bigger deal than the iPod, and that's saying a lot.
You're crazy! Jobs just demoed a wireless replacement for a $5.00 cable that connects your computer to your TV. If you think this will change everything you're nuts!
First off Apple still has not managed to get much video content for their iTunes store.
Second, Apple has yet to supply any HD content.
Third, one of the biggest sources for high-speed broadband in the US is cable. So Apple isn't putting any cable company out of business anytime soon.
Fourth, content providers like ABC, CBS, NBC, Fox, etc. will not make the content available to Apple until after it has been released to cable or over the air. Otherwise they will lose significant money from advertisers for exclusive airing rights content.
In other words, don't disconnect your cable, over-the-air antenna, or satellite antenna anytime soon.
TennisandMusic
Apr 21, 02:46 PM
I own 3 Macs and 5 devices. I have a PhD in electrical engineering and designed microprocessors for 14 years, including microprocessors used in many PCs. I've written millions of lines of source code in C, assembler, C++, etc.
And most of the folks I know who use Linux or Solaris all day at work to design chips use Macs at home and carry iPhones. I don't know a single one of them who uses an Android phone (many carry BlackBerrys, however).
Just out of curiosity, why do you suppose that is? The *NIX family? Or something else? I'd like to hear your perspective.
AP_piano295
Apr 23, 12:35 AM
I don't think atheism is a belief system, but it requires belief. Not believing in a god requires believing there isn't a god. You could say I'm just twisting words there.
I agree on all your points. I just can't bring myself to completely deny the existence of god, not through fear, but through fear.. of insulting my own intelligence. We can't prove god exists or doesn't exist, it seems impossible that we ever will. So I don't deny the existence of god, I do think it's unlikely and illogical, hence why I lean towards atheism (agnostic atheist).
Here's a hypothetical question:
Do you believe in witches? (I assume the answer is no)
Now we don't have a special word for people who don't believe in witches. You probably wouldn't claim that not believing in witches requires belief.
Now the fact that you don't believe in those things doesn't necessarily preclude their existence. You just don't believe in them, because I imagine nothing in your life experiences or in the evidence you have been presented suggests that true witches exist. Would you say that this viewpoint requires belief?
Do you think it's possible that you give religion and god undue weight and consideration because so many others believe in him/her/it and you have a hard time believing that so many people could be so totally wrong?