mulo
Apr 25, 09:02 AM
You're 16 going 20 mph over the speed limit. You are not a COMPLETELY safe driver, not even a little.
I'm 20 going 2x the speed limit, in this case the posted limit is 80mph (my car won't go any faster...)
Safe driving has nothing to do with age; in fact, most elderly people are utterly horrible drivers. It all has to do with attention span (which elderly people just don't have altogether). To the point: so long as no one and nothing is distracting them, most young people are great drivers.
edit: @xboxer75010 hahahahahahahah http://www.youtube.com/watch?v=ELZQ-Z6lASI
SolRayz
Mar 23, 06:52 PM
All in favor of censorship...please move the hell out of this country and settle your asses in China, North Korea, or better yet Libya.
peharri
Sep 18, 07:52 AM
I'm sure I'm late getting into the argument, and that fanboyism depending on what network you're on will not change, but I really think GSM does have better voice quality than any other network.
(Before I begin, a quick terminology comment: I'm going to avoid "CDMA" and use the term "IS-95" instead. I try to avoid terms like "CDMA" and "TDMA" because they generally confuse people. Many think the next version of GSM, UMTS, is actually IS-95 because it incorporates a CDMA air interface called W-CDMA, for instance. Others think GSM is the same thing as the D-AMPS/IS-136 system used by (the various phone companies that became) Cingular until they started moving to GSM, because both have a "TDMA" air interface and IS-136 is usually called "TDMA". In practice, UMTS and IS-95 have almost nothing in common, UMTS is a revision of GSM, and GSM has almost nothing in common with IS-136.)
There's no way to compare the two directly. Both IS-95 and GSM implement a variety of codecs that are deployed differently by different operators. In the area I live, Cingular (GSM) tries to force many phones to use something called AMR-HR, which has "acceptable" voice quality when you have good reception and drops to barely comprehensible with any deterioration in signal strength. T-Mobile (GSM) clearly doesn't, and I can talk and listen to someone with both of us sounding like we're on a landline with one bar of signal. On the same phone.
Likewise, Verizon (IS-95) uses some awful low-bitrate codec on its network where I live (I believe they're heavily oversubscribed here), where pretty much everyone sounds like they're dying from some serious lung problem, while Sprint PCS (also IS-95) doesn't, and the call quality, at medium to good reception, generally seems pretty much OK. Sub-landline, but not seriously so.
With the variety of voice codecs the operators use, you can't really make a fair judgement merely on the basis of network technology. Either the operator's cheap, or it isn't. IS-95 was chosen by many networks on the basis that it's spectrum-efficient (i.e. it's cheap), but on the other hand Sprint PCS, when I used it, was always content to drop calls to handle network overloading rather than seriously compromise on call quality. Cingular's move to GSM has caused problems in that it's a significantly less spectrum-efficient technology than the one it replaced, so Cingular has had to, in many places (hopefully temporarily), use the crappy half-rate codecs to boost capacity until it can get more towers online.
I wouldn't use voice quality as a way to judge the technologies.
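To put rough numbers on the half-rate trade-off described above, here is a quick sketch using commonly cited GSM figures; the inputs (8 full-rate or 16 half-rate traffic channels per 200 kHz carrier, AMR at 12.2 kbps full-rate vs. 7.95 kbps as the highest AMR mode that fits a half-rate channel) are ballpark assumptions, not measurements of any particular network.

```c
#include <stdio.h>

int main(void)
{
    /* Assumed GSM capacity figures: 8 full-rate traffic channels per
       200 kHz carrier, or 16 when half-rate channel mode is used. */
    const int fr_calls = 8;
    const int hr_calls = 16;

    /* Assumed codec bitrates (kbps): AMR 12.2 on a full-rate channel
       vs. 7.95, the highest AMR mode a half-rate channel can carry. */
    const double fr_kbps = 12.2;
    const double hr_kbps = 7.95;

    printf("Full-rate: %2d calls/carrier at %.2f kbps each\n", fr_calls, fr_kbps);
    printf("Half-rate: %2d calls/carrier at %.2f kbps each\n", hr_calls, hr_kbps);
    printf("Capacity gain: %.0f%%, speech bitrate cut: %.0f%%\n",
           100.0 * (hr_calls - fr_calls) / fr_calls,
           100.0 * (fr_kbps - hr_kbps) / fr_kbps);
    return 0;
}
```

Under those assumptions, half-rate doubles the calls per carrier while cutting the speech bitrate by roughly a third, which is exactly the capacity-for-quality trade the complaints above are about.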
LarryC
Apr 22, 06:51 PM
AMD would be producing better CPUs now, via increased profits, if Apple had chosen them over Intel from the start.
I always thought that if Apple ever went with something other than PPC that they would go with AMD. Better late than never.
HecubusPro
Sep 4, 07:56 PM
I'm confused. Movie downloads for $10?!? What happened to the whole "Jobs is hammered by the movie industry into movie rentals only" ?!? This CANNOT possibly mean renting a movie for $10!! :eek:
My bet is that it's low-res/iPod-quality video for purchase. Apple/Steve Jobs have yet to get into the home theater business; so far it's been the mobile entertainment business only. Movie rentals (or purchases, for that matter) at home theater quality are a whole other enchilada.
Watching a 320x240 movie on my 42" plasma would sort of suck and wouldn't be competitive, as others have mentioned. Would I buy a $10 movie to watch on my iPod? Mmm... probably a few, to keep me entertained on the treadmill and my son entertained on road trips.
Rumors are rampant, but they do bring up a good point, as you do here. Who would want to watch a movie on an iPod? (Well, actually, I have and I do, but that's beside the point.)
The AppleInsider rumor at least makes sense from an iTunes/TV/movie-purchase standpoint. Renting would be sort of a PITA. Who would want to download a good-quality movie, often taking hours or days (unless you have a lot of people torrenting at the same time), just to have it accessible for a week or so? Not me.
This will be a movie-purchase service. You buy the movie, download it from iTunes, then do what you want with it: watch it on your computer, rip it to DVD and watch it on your TV, or run it through an AirTunes-like device so you don't have to rip it if you don't want to.
It sounds pretty interesting to me. We'll see when it happens. Regardless, the quality is going to have to be pretty good for people to want to watch these on their TVs. Offering 700 MB .avi rips just won't cut it.
seedster2
Apr 16, 08:21 PM
You have to admit this thread is really funny.
How many times have we heard Apple lovers say it's not all about "specs", that the general public isn't interested in "specs", and rubbish others who say how much better specced their PC might be?
And yet, now that Apple has the high specs, all of a sudden THIS IS the most important thing.
No average consumer is ever going to notice the difference between USB3 and Thunderbolt; in fact, USB3 will be better for the general user experience, as it's backwards compatible.
But now, sod the typical consumer, the only thing that matters now is specs.
Oh, you have to laugh, don't you? :D
It is par for the course.
Just like we didn't need quad core because it ran too hot for no real benefit. Or we didn't need 3G in the 2007 iPhone because WiFi was good enough. Or we don't need LTE because HSPA+ is fast enough.
;)
It's something I've observed as well. It's an entertaining phenomenon.
G5power
Jul 14, 09:18 AM
This is good to see: high-performance chips from Intel and a great design from Apple. It will be fun to see what is announced at WWDC.
spicyapple
Sep 19, 01:41 PM
I have an idea:
Sell Pirates of the Caribbean: Dead Man's Chest in a High Definition format to test the waters. I think a lot of people would buy it in HD since they already have computers capable of decoding it. Why the need to invest in an expensive HD DVD player?
blokey
Mar 30, 12:45 PM
Agree with Microsoft.
I suppose Apple could go the route that "App" is not short for "Application" but instead is short for "Apple".
Balli
Sep 5, 06:59 AM
If they release MacBook Pros, I wonder if the top end models will come with a Blu-ray option. I know people have dismissed this before but I just noticed that Sony has released "The world's first Blu-Ray disc enabled notebook." Will the 17" MBP be next?
-Squire
I guarantee that Apple will choose to put in a hybrid HD DVD/Blu-ray drive rather than limit the Mac to one format... (even though they are supporters of Blu-ray).
Also, it might be a while before Apple's engineering team figures out how to fit the newly released drives into the thin MacBook Pros.
firewood
Mar 29, 12:22 PM
It's actually a trap for Nokia. Nokia gets a substantial portion of its market share from selling low-priced phones. People who buy cheap phones don't have as much money to buy apps. App developers who want to make money will develop apps for people with money, who buy the more expensive (higher-margin) smartphones. Then customers who want a rich app environment won't buy the cheap phones, because those won't have as many cool new apps; they'll buy iPhones. It's a vicious circle. Apple doesn't need market share to keep printing money (and investing it in R&D and marketing for new cool products).
striker33
Apr 25, 02:12 PM
Don't understand the hate for the UB looks; mine looks seamless with the sexy AG silver border. I can understand the hate towards the fugly mirror screens, though.
cmaier
Nov 13, 05:14 PM
You really think so? Three programs between these two development teams: Facebook and then these two. Yeah, I see a huge tide turning right now. Please.
And the paid app didn't even sell that well.
You're talking about some hardcore Apple supporters, well known in the community, jumping ship. It ain't a good sign.
Surely
Apr 20, 10:24 AM
What evidence, though? Just stating it means nothing. Prove it. Show us the data from that time when it was off.
The paragraph I quoted kind of explains it.
I agree though, I'd like to see more proof if it is true.
morespce54
Apr 4, 12:20 PM
What is your firearms experience? How many times have you been shot at? Do you think the security guard could make a Hollywood head shot?
Not much to be honest but hey, that's only my 2c.
Don't lose any sleep over it! ;)
jwdsail
Sep 16, 09:49 AM
Hmmm, that is an interesting thought. I saw a demo, over a year ago, of a wireless VoIP phone at Dartmouth that did just that. They wear them around their necks or use a clip, but it was voice-activated, and they actually called them their "Star Trek badges".
http://www.vocera.com/
So, a quad-band GSM iPhone based on the new clip Shuffle? Perfect! Speakerphone mode or BT headset... voice-activated... sync the phone number and voice dialing through an updated Address Book. Perfect!
;-)
jwd
Susurs
Apr 22, 04:55 PM
They'd have done better to find a place for an Nvidia or AMD GPU via PCIe than for that Thunderbolt...
Eidorian
Aug 28, 12:11 PM
http://guides.macrumors.com/Merom
kevin.rivers
Jul 14, 12:41 PM
Yup, I know Apple's marketing loves to be ridiculous. :p 95% of customers* wouldn't notice the difference. I'm one of the 5% who will notice, but it's not like I'm buying one; my iMac G5 will keep me happy for another 2+ years.
*75% of statistics are made up on the spot ;)
Very nice. :D
I have to admit, there will be a part of me that wants to drop a Merom into my iMac CD. I may just do it.
AppleCare or Merom? So many choices!
res1233
May 1, 12:54 AM
Call me clumsy or whatever, but I hate the 'corners': I accidentally trigger them all the time on a friend's machine, mostly because I use the Apple menu a lot. I DO miss the old mouse's side buttons/center button!
Experienced Mac users know to assign modifier keys to those corners to prevent that. I have the top-left corner set to turn my display off (good for porn), but only when the Command key is pressed.
gnasher729
Sep 11, 07:42 AM
No, not at all.
An affinity mask defines the set of CPUs a job can be scheduled on. A job won't be run on another CPU, even if its assigned CPUs are at 100% and other, idle CPUs are available.
And that, by the way, is why setting affinity is usually a bad idea. Let the system dynamically schedule across all available resources -- or you might have some CPUs very busy, and others idle.
Win2k3 also has "soft" affinity masks, which define a preferred set of CPUs. If all of the preferred CPUs are busy, and other CPUs are idle, then soft affinity allows the system to run the jobs on the idle CPUs - even though the idle CPUs aren't in the preferred affinity mask.
Another aspect of quad-core systems like the Mac Pro or future Kentsfields: on these systems, two cores share one 4 MB cache. If an application runs two threads, they can run on two cores on the same chip, or on two cores on different chips. Threads that run on the same chip can exchange data very quickly, because anything that is in one thread's L2 cache is automatically in the other thread's L2 cache, but both threads together have only 4 MB of cache. Threads running on different chips cannot exchange data quickly; data that is exchanged needs to be transferred through main memory. However, _each_ chip has 4 MB of cache, or 8 MB total.
In other words, some applications will run faster using threads on the same chip, and some will run faster using threads on separate chips. It is quite hard for the OS to guess, but the application developer should have some idea.
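As a concrete illustration of the hard vs. soft distinction described above, here is a minimal sketch against the Win32 APIs involved (SetThreadAffinityMask for the hard mask, SetThreadIdealProcessor for the soft "preferred CPU" hint); the particular CPU numbers and masks are arbitrary examples:

```c
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HANDLE thread = GetCurrentThread();

    /* Hard affinity: restrict this thread to CPUs 0 and 1 (mask 0b0011).
       The scheduler will never run it elsewhere, even if CPUs 2/3 sit idle. */
    DWORD_PTR old_mask = SetThreadAffinityMask(thread, 0x3);
    if (old_mask == 0)
        fprintf(stderr, "SetThreadAffinityMask failed: %lu\n", GetLastError());

    /* Soft affinity: hint that CPU 0 is the preferred ("ideal") processor.
       The scheduler tries CPU 0 first, but may still use any CPU in the
       affinity mask when CPU 0 is busy. */
    DWORD old_ideal = SetThreadIdealProcessor(thread, 0);
    if (old_ideal == (DWORD)-1)
        fprintf(stderr, "SetThreadIdealProcessor failed: %lu\n", GetLastError());

    return 0;
}
```

On a two-chip, four-core system like the one described, a hard mask that pins both of an application's threads to the two cores of one chip lets them share the 4 MB L2, while splitting them across chips trades fast data exchange for 8 MB of total cache; note that the mapping from CPU numbers to physical chips varies by system.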
SmalTek
Nov 14, 12:21 PM
I think that Apple doesn't have the resources for a decent-quality review process.
The App Store works in such a way that all underdog app developers want to update their apps as often as possible: a new update brings an app to the first page in its category, sorted by date (for a day or two).
Apple does not have the guts or desire to charge for reviews, and all this mess goes on. They "review" apps very formally, and I suspect that this is outsourced to India.
If Apple wants to make this right, they should include 10 or 20 reviews in the annual $100 developer fee, and charge $20-$50 for each additional review. That would greatly reduce the number of updates and increase the quality of reviews.
I myself have several apps in the App Store, and my apps and updates have also been rejected many times for formal reasons that were totally stupid in the context of my apps.
And what's also funny: Apple suddenly rejected a critical update with a bug fix because of a piece of graphics that had already been in my app for 6 months :-)
econgeek
Apr 14, 12:21 PM
We really should be hoping that Thunderbolt succeeds and USB 3 fails. USB has always been a hack for lowest-common-denominator PCs and for PC manufacturers who were not interested in investing in quality external communication.
USB is a poorly designed protocol, and rather than fix it, they have just extended it with USB3 and pretended it is faster.
In real-world use, USB3 is more like 2.5 Gbps, one way.
In real-world use, Thunderbolt is 20 Gbps, both directions (two 10 Gbps channels).
That means Thunderbolt is effectively about 8 times faster than USB3 (20 Gbps vs. 2.5 Gbps), if you maxed it out. Right now the two are competitive only because we don't have external devices capable of maxing out the bandwidth... but eventually we will.
I'll have to seriously consider delaying getting a new iMac until 2012 now. I don't want to be caught having to buy more expensive Thunderbolt external drives. Thunderbolt is great only if the drives are no more expensive than USB 3.0 drives.
What will be cheaper is whatever is more popular. Thus we want Intel to delay support for USB3 and give Thunderbolt time to be adopted widely. We really need to avoid another FireWire situation here, lest the entire world be held back by a crappy, second-rate technology that happens to be ubiquitous.
Look at the price difference between a USB 2 hard drive and a FireWire one: it is purely due to the USB market being bigger; there is no technological reason for it.
Think about the millions of people copying large files onto 1 or 2 TB USB drives and how long they have to wait... with no advantage of USB over FireWire.
USB2 is not even as fast as FireWire 400, let alone FireWire 800.
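For a rough sense of the scale involved, here is a back-of-the-envelope sketch of the copy times at stake. The effective-throughput figures are illustrative assumptions (the quoted real-world rates above, plus rough FireWire and USB 2 numbers), and in practice the drive itself is often the bottleneck:

```c
#include <stdio.h>

int main(void)
{
    const double terabyte_bits = 1e12 * 8.0;  /* 1 TB expressed in bits */

    /* Assumed effective throughputs in bits per second; illustrative
       ballpark numbers, not benchmarks. */
    struct { const char *bus; double bps; } buses[] = {
        { "USB 2.0      (~0.25 Gbps)", 0.25e9 },
        { "FireWire 800 (~0.70 Gbps)", 0.70e9 },
        { "USB 3.0      (~2.5  Gbps)", 2.5e9  },
        { "Thunderbolt  (~10   Gbps)", 10e9   },
    };

    for (int i = 0; i < 4; i++)
        printf("%s: %7.1f minutes per TB\n",
               buses[i].bus, terabyte_bits / buses[i].bps / 60.0);
    return 0;
}
```

Even with these optimistic rates, the bus dominates: roughly nine hours per terabyte at USB 2.0 effective speeds versus under an hour at USB 3.0 speeds, which is exactly the waiting the post is complaining about.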
Drat, I just bought a MBP, my first laptop upgrade in 4 years :( Hopefully we get a Thunderbolt-to-USB3 connector.
Those have already been announced at this week's NAB. Apple will likely include USB3 in its laptops, though.
Anaemik
Apr 19, 06:58 AM
According to the Yahoo news article, Apple was Samsung's second-largest client in 2010 after Sony Corp and was responsible for $142 billion (4%) of Samsung's revenues last year.
So Yahoo would have us believe that Samsung's revenues last year were in the region of $3.5 TRILLION???? LOL
Tell me they were responsible for 4% of a $142B total ($5.7B) and I'd have a much easier time believing it.
edit: Ahh, seems like I was just beaten to it.