dongmin
Sep 19, 10:02 AM
It gets annoying. Why? Because it's true and most people don't want to admit it.
In a few cases here and there, the extra processor power/speed is going to help. But for a majority of people buying a MacBook, they're not going to be burning home-made DVDs, doing intense music compositions, or using it for hard-core gaming. They're going to SURF and WRITE.
As for the "resale" value, again, most people who are buying a used MacBook are NOT going to ask "is it a Merom?" They're going to ask how nice the case is, how much use it's gotten, and how much it is, and that's it.
Everybody likes to play "ooo, I'm the hard-core computing whiz and I need the BEST out there", but I bet you if you took an honest poll of everyone who's answered this thread, you'd find at least 75% of these Apple fans have no need for the extra speed; they just want it because it's "cool" and "fast" and it's the latest thing out there.
While you make some valid points, you overlook others:
1. As soon as the new model comes out, the older models will drop in price. So even if you aren't getting the fastest and greatest, even if you're buying the lowest end MBP, you'll benefit from the price break.
2. MBPs are expensive computers. You're investing in something that you'll keep around for 3-4 years. I want to future-proof my computer as much as possible. Features like an easily swappable HD and a fast graphics card will affect "the average user" 2+ years from now (pro'ly sooner), when everyone's downloading and streaming HD videos and OS X has all this new eye candy that will require a fast graphics card.
3. There are other features besides just a 10% increase in CPU power that we are hoping for in the next MBP, including a magnetic latch, easy access to the HD and RAM, and better heat management. Certainly the average Joe will be able to benefit from these features, even if all you do is word process and surf the web.
ChickenSwartz
Aug 26, 06:18 PM
I have just ordered an MBP :( It wasn't supposed to ship until Monday, but it shipped early :( If the rumors are true, will I be able to send it back and get the new one? Has anyone had any experience returning unwanted stuff to Apple? Time is not on my side (I leave for uni on the 16th Sept).
I never have but I hear it is pretty easy if you DON'T OPEN THE BOX.
The only change is likely to be the CPU. The rest of the MBP will probably be kept the same, and if you look at the Yonah vs Merom benchmarks at places like AnandTech, it probably isn't worth sending it back.
It has been rumored that there might be some minor changes to the computer, such as an easily removable HD and a different latch, as said above. But more importantly, I hope they bump the clock speeds and include 1GB RAM as standard on the lowest MBP model for the same price.
milo
Aug 17, 09:21 AM
You're right. I'm extremely unimpressed that the fastest Xeon, only days old, is actually slower MHz for MHz than a G5 that is pushing 4-year-old technology. Really sad.
But overall it's not. Whenever you change chips, you'll probably always find a benchmark that favors the old one. Just because one app isn't faster doesn't mean the new chip is slower.
But it's not faster. It's actually slower than the G5 in some apps. What's everyone looking at anyway? I'm pretty unimpressed, other than Adobe's usage of cache (AE is a cache lover and will use all of it, hence the faster performance).
But the actual Xeon processors are only as fast as the G5 processors. Look at the average specs... the 2.66 machines are only a teeny bit faster than the G5s except in a few apps like FileMaker. But not in the biggies like Final Cut Pro, where it actually appears that MHz for MHz the G5 is a faster machine hands down!
What are you talking about? The Xeon is faster in every native benchmark; the only exception is one render where the slower Xeon tied the G5. If you do indeed look at the average specs, the Xeons blow away the G5.
Looks like the Xeons got killed by the G5 in Word in their tests.
Because it's running under Rosetta; RAM has nothing to do with it.
It's odd, seeing as Macs are still the choice for many musicians, that the kinds of specs that would be of interest to musicians are never given. The released figures don't do much for me. I'd like to know the polyphony improvements, say for Kontakt, under both systems in Digital Performer 5.
There have been Logic benchmarks elsewhere, and they're pretty impressive. 1.4-1.5x improvements, pretty nice considering how fast the quad is already for audio plugins.
Kranchammer
Apr 6, 04:43 PM
You both ignored HOT DOGS! Sheesh, hot dogs rule. The only problem is kids under 6 choking on them unless you cut them right. But that will be fixed in the v3.0 hot dog; they will come pre-sliced.
GTFO. :mad:
Or are you counting on the deal with that Swedish sausage company to save hot dogs from doooom?
The race to the bottom continues...
AppleScruff1
Apr 20, 02:13 AM
Why do you keep countering an argument that no one is actually making?
Straw man fail.
Not at all. I'm only showing where Apple has done what they don't like being done to them. Only a die-hard defends them at all costs.
milozauckerman
Jul 27, 06:49 PM
Looking at reference systems: for $2049, Gateway's Core 2 Duo gets the 2.4GHz/4MB L2 cache Conroe, 2GB of RAM from the factory, an X1900 512MB graphics card, a 320GB hard drive, a card reader, and a DL DVD burner.
Apple had better step its game up compared to the prices/specs rumored last week. A weak graphics card and 512MB of RAM aren't going to cut it in the low tower, even if it does have XEON INSIDE or whatever the marketing pitch will be to distract us.
EDIT: Dell would be even cheaper, with a lesser video card, but there doesn't seem to be a way to separate the XPS 410 from the included 20in monitor as of now.
satty
Jul 20, 08:38 AM
Not that I wouldn't mind more processing power :D ...
but to me it doesn't make much sense for the majority of tasks/applications.
There might be rare exceptions in the professional area, and of course it makes lots of sense for a server, but for a single-user machine?
Whatever, bring them on... in this case I'd like to be proven wrong.
bbplayer5
Mar 31, 03:19 PM
Android > iOS. This just makes it even better that they are going to tighten up on what providers are doing to bend over the consumer.
Deflorator
Mar 31, 03:32 PM
What the heck is this? The "Steve was right" month?
Pathetic Dell and HP, desperate Microsoft, Samsung aka Mr. "Smoothbastic", Google inhibiting fragmentation - the very fragmentation which does NOT exist, really...
Who is next? Oh, I have got it: Adobe. So come on, resistance is futile.
Stella
Aug 6, 08:24 AM
That's great news. I was wondering if a 6-week-old machine was going to be left in the dust by the new chips. Santa Rosa in April 2007?
Another sad person who is worried about their machines not being top of the line :-\
It's a computer; you should expect your machine to be superseded by another model in a matter of weeks/months.
Apple is a business; it doesn't exist to make you feel somehow superior because of your computer.
jicon
Aug 17, 01:02 AM
Lots of stuff on Anandtech about the poor memory performance on the Intel chipset.
Looks like the Xeons got killed by the G5 in Word in their tests.
Might be an interesting machine when/if the motherboard chipset/memory performance issue is looked into.
I think part 3 of their review will be telling, pairing the machine up against XP machines in a variety of tests.
Mikey7c8
Mar 31, 08:47 PM
John Gruber's take:
So here's the Android bait-and-switch laid bare. Android was "open" only until it became popular and handset makers dependent upon it. Now that Google has the handset makers by the balls, Android is no longer open and Google starts asserting control.
Andy Rubin, Vic Gundotra, Eric Schmidt: shameless, lying hypocrites, all of them.
Can't say I disagree.
I completely disagree.
Going open sounded like a great idea in the beginning. Fast forward to today, and manufacturers have used the openness against the platform by creating custom versions of Android that aren't readily upgradable.
This has hurt the platform more than 'being open' helped it, and Google is right to start regulating what can and cannot be done.
I think we're all pretty lucky to have experienced both sides of the spectrum to be honest :)
AidenShaw
Aug 23, 08:15 AM
My Quad G5 is dead silent all the time. Those noisy Quads should have been sent off for repair. I was told the Quad in the Santa Clara Apple Store was also very loud. That is not normal. Properly serviced, they run very quietly.
dBA? A system with 9 fans isn't going to be silent, period.
Are your systems in a room with a lot of ambient noise? (A wind-tunnel G4 sounds quiet at Best Buy, yet in my den I can clearly hear the fluid-bearing drive in my Yonah dual... ;) )
gnasher729
Aug 26, 06:09 PM
I believe the 2.33 GHz Merom chip debuted at the same price as the 2.16 GHz Yonah when it was released. The prices of MBPs certainly haven't fallen. Apple has just been enjoying the extra profits from Intel's price drops of the past few months.
At that time, Apple upgraded all MacBook Pros to the next faster chip without changing prices.
Multimedia
Jul 27, 01:55 PM
Well, it's back to the future for all of us. Remember when the Mac was going 64-bit with the introduction of the Power Mac G5 on June 23, 2003? :rolleyes: Only a bit more than three years later, and we're doing it all over again thanks to Yonah's 7-month retrograde.
tk421
Apr 5, 06:10 PM
Really? And yet, it seems to be good enough for the top directors in the industry... some of the recent Academy-nominated films were edited on Final Cut, including the Coen Brothers' "True Grit" and "Winter's Bone". Also, David Fincher and Francis Ford Coppola used FCP on their last films... these are all people who have access to and can afford cutting their films on Avid, and yet they recently chose Final Cut Pro... so why do people even question it? :rolleyes:
It's good enough for a few top directors in the industry, but not very many. They are the exception, not the rule.
Final Cut needs better media management, and also Avid-like support for multiple editors on a single project. I like Final Cut a lot, but Avid has some clear advantages for a feature film. Here's hoping this next version has some big new features!
Thor74
Apr 19, 02:21 PM
Apple had better not win this case, and anyone who thinks they should is a fool.
I'm doing my fool dance right now...
We can dance if we want to
We can leave your friends behind
'Cause your friends don't dance and if they don't dance
Well they're no friends of mine
I say, we can go where we want to
A place where they will never find
And we can act like we come from out of this world
Leave the real one far behind
And we can dance :D
hadleydb
Aug 17, 01:15 PM
I need one... or is it more of a want? Need.:eek:
Amazing Iceman
Mar 31, 10:02 PM
I've really loved my experience with Android so far. I've had an iPhone and an iPhone 3G, and I am an iPhone developer... yet I use Android.
Android will always be "open source" and this is not inconsistent with Google applying more control to stem inoperable fragmentation. These two ideas are not at odds.
I cannot wait for Google to do what I think Amazon is currently trying to do with their new App Store.
That said, I really like the new iPad 2, but sadly my next purchase would prolly be an i7 MacBook Pro.
Just a quick question, hopefully not off topic:
Which one do you prefer as a developer, not as a user: iOS or Android?
Good choice about the MBP i7. It's been over 3 years since I got my MBP, and it's time to replace it, but I may get an i7 iMac instead, as I now carry my iPad everywhere.
If a really good MBP comes out, I may reconsider and get one instead of the iMac. Too soon to decide.
zacman
Apr 19, 02:34 PM
Sigh. The iPhone is still gaining market share. Not losing market share.
You're wrong. Apple has been losing market share for over 2 years now. Just because they are selling MORE iPhones doesn't mean they are gaining market share. The market grows much faster than iPhone sales. Have a look at Nokia: in Q4/10 Nokia sold almost 7 million more smartphones, but they lost about 10% market share. In Q1/11 Apple lost about 2% market share despite the fact that they sold about 2.5 million more iPhones. Just read the latest GfK numbers (needs a registered account); it's all in there. NPD numbers for Q1/11 will be released next week if you trust them more.
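To see the arithmetic behind this, here is a minimal sketch with purely made-up numbers (not the actual GfK or NPD figures): unit sales can go up while market share goes down whenever the overall market grows faster.

# Illustrative numbers only -- not real GfK/NPD data.
def market_share(units, market_total):
    # Market share as a percentage of total units sold.
    return 100.0 * units / market_total

# Hypothetical quarter-over-quarter figures, in millions of units.
apple_prev, apple_now = 16.0, 18.5        # 2.5M more iPhones sold...
market_prev, market_now = 100.0, 125.0    # ...but the whole market grows faster

print(f"previous quarter: {market_share(apple_prev, market_prev):.1f}%")  # 16.0%
print(f"current quarter:  {market_share(apple_now, market_now):.1f}%")    # 14.8%

More units, lower share - exactly the pattern described above.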
Macnoviz
Jul 20, 10:14 AM
At some point you're going to have diminishing returns. Sure, multimedia apps can take advantage of a few more cores, but I don't see Mail running faster on 4 cores, never mind 2!
How fast do you want Mail to go? The main reason you need a good processor is not browsing, e-mail, text, and so on. I highly doubt someone who does all these things on a five-year-old computer will be much slower than someone on a 16 GB RAM, top-of-the-line Power Mac.
Why don't they just call it: Big Mac.
I think that's the best name I've heard in this thread (sorry, Chundles)
Warbrain
Aug 25, 02:54 PM
I suspect a large number of the issues are stemming from problems with the Intel Macs, and people are probably calling more about these problems. I could be wrong.
But yesterday did suck. That site went down in an instant. But then again, the Apple recall got a whole lot more news coverage than the Dell recall.
epitaphic
Aug 19, 09:06 AM
Can I rotate the 2nd display 90 degrees like I can in Windows?
Short answer: Yes
Long answer: Yes you can
;)
shamino
Jul 20, 09:58 AM
No, I think you are confused. :) I meant "Is having more cores, let's say 8, more efficient than one big core equal in processing power to the 8 cores?"
First of all, you assume that it is possible to make "one big core equal in processing power to the 8 cores". I don't think it is possible to do this (at least not with the x86 architecture using today's technology.)
But assuming such a chip exists, the answer depends on what kind of efficiency you're thinking of.
If you mean computational efficiency (meaning the most useful processing per clock-tick), then a single big core will do better. This is because single-threaded apps will be able to use the full power (whereas multiple threads are needed to take advantage of multiple cores.) Also, the operating system can get rid of the overhead that is needed to keep software running on the multiple cores from stepping on each other.
If you mean energy efficiency (amount of processing per watt of electricity consumed), then it could go either way, depending on how the chips are made. But given today's manufacturing processes and the non-linear power curve that we see as clock speeds are increased, the multiple-core solution will almost definitely use less power.
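To make the single-thread point concrete, here is a minimal, hypothetical Python sketch (the workload and numbers are invented for illustration): the same CPU-bound job run serially on one core and then split across a pool of worker processes. On a multi-core machine the pool version finishes sooner, but only because the work was explicitly divided; a single-threaded program gets no benefit from the extra cores, which is the trade-off described above.

# Hypothetical illustration: one CPU-bound job run serially vs. split across cores.
import time
from multiprocessing import Pool

def burn(n):
    # Dummy CPU-bound work: sum of squares up to n.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    chunks = [2_000_000] * 8            # eight equal chunks of work

    start = time.time()
    serial = [burn(n) for n in chunks]  # runs on a single core, one chunk at a time
    print(f"serial:   {time.time() - start:.2f}s")

    start = time.time()
    with Pool() as pool:                # one worker process per core by default
        parallel = pool.map(burn, chunks)
    print(f"parallel: {time.time() - start:.2f}s")

    assert serial == parallel           # same results, different wall-clock time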