
 AMD Bulldozer & Bobcat

     
Najmods
post Aug 26 2010, 11:33 PM



QUOTE(davidbilly87 @ Aug 26 2010, 10:14 AM)
It only has 2 cores?
*
No, they just wanted to show the core architecture, so they only showed one 'block' of the core.

QUOTE(jeopardise @ Aug 26 2010, 11:07 PM)
AMD is taking a risk introducing a new architecture on 32nm silicon. Hope it works.
*
It's not a risk; they MUST create a new architecture to compete with Intel CPUs. Their desktop lineup is pretty good, but their mobile segment is poor compared to Intel's counterparts, in both performance and battery life.

The Bulldozer architecture is pretty radical. A module is not a 'true' dual core, since the two cores share the fetch and decode blocks, the FPU and the L2 cache, and AMD claims '80% of the performance of a true dual core' in exchange for a reduced thermal and power envelope. I hope there is more to it than that.
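To picture the sharing, here is a minimal sketch (plain Python, labels are mine from the public block diagrams, not AMD's documentation) of what each module duplicates per core versus what it shares:

CODE
# Rough sketch of the resource split inside one Bulldozer module,
# based on AMD's public block diagrams. Labels are illustrative only.

SHARED_PER_MODULE = [
    "fetch",        # one front end feeds both cores
    "decode",
    "fpu",          # shared floating-point unit
    "l2_cache",
]

PRIVATE_PER_CORE = [
    "integer_scheduler",
    "integer_pipelines",
    "l1_data_cache",
]

def describe_module(module_id: int) -> str:
    """Summarize what the two 'cores' in one module actually own."""
    return (f"module {module_id}: 2 integer cores | "
            f"shared: {', '.join(SHARED_PER_MODULE)} | "
            f"private per core: {', '.join(PRIVATE_PER_CORE)}")

# A chip marketed as 8 cores is really 4 of these modules.
for m in range(4):
    print(describe_module(m))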
Najmods
post Aug 27 2010, 11:59 AM



QUOTE(jeopardise @ Aug 27 2010, 06:33 AM)
Well, from your reply it still sounds like AMD is taking a RISK, because they MUST (take the risk to) create a new architecture to compete with Intel's 32nm technology. No denying that. They also risk delaying the 32nm transition where Intel is already ahead of AMD. So far the performance scaling looks promising, but it is based on simulations of the architecture design.

The longer Bulldozer is delayed, the greater the chance that it'll debut into the teeth of new six-core and eight-core Sandy Bridge products.
*
Simulations are nothing compared to real-life performance. At the very least I would like to see L2/L3 cache sizes and clock speeds. They are delaying far too long, unless they have something good up their sleeve.
Najmods
post May 30 2011, 04:25 PM



QUOTE(yinchet @ May 30 2011, 03:59 PM)
It seems like they can't get the performance they want...?
*
Better late than never. I'm sure people don't want another TLB bug like the one on the early B2-stepping Phenoms.
Najmods
post May 30 2011, 11:51 PM



QUOTE(adie82 @ May 30 2011, 11:16 PM)
what does DNF stand for?
*
DNF stands for Did Not Finish, but I guess he means Duke Nukem Forever, the most famous 'vaporware', which ironically shares the same acronym.
Najmods
post Jun 29 2011, 09:32 PM



QUOTE(billytong @ Jun 29 2011, 08:30 PM)
Exactly. Back then a regular Pentium 1/AMD K6 needed just a small HSF to cool down; then we reached the Athlon Thunderbird, and everyone was complaining about its high 60-70W TDP.

From a practical point of view, I think a laptop with an AMD E-350 plus a standard-size 12-cell battery for ultra-long battery life is a lot better. I really like to take long trips with my laptops without cables.

And none of the laptop companies have built the power adapter into the laptop casing. How many times do you end up carrying the heavy adapter with its long, thick cables around when you carry your laptop?
*

That's the problem: when the manufacturing process gets smaller, they just abuse it for more power instead of making the chips more power efficient.

A 12-cell battery plus a built-in power brick is not what you call portable. A netbook power brick might be small, but fitting it inside a 10- or 12-inch body would be a challenge. Have you seen the size of a typical 12-cell battery? It usually protrudes underneath the laptop and is very heavy.
Najmods
post Jun 30 2011, 02:08 AM



QUOTE(billytong @ Jun 30 2011, 12:43 AM)
And carrying a separate adapter is called portable?

I'd say it is a lot more troublesome to carry a separate adapter around. Laptop battery life isn't long enough to let us leave the charging cable at home while we carry our laptops around, so the extra weight of the adapter doesn't matter, because 90% of the time we end up carrying it anyway. You can't call it compact and portable when you have to carry the adapter around.

I keep seeing that the Bobcat parts on AMD's roadmap aren't really "power efficient". 18W is nowhere near low power, and neither is the 9W Bobcat; it's no wonder the long-running ARM-based smartphones and tablet PCs are taking over the portable market share.
*

You don't see the whole picture. Let's put it like this: imagine building the adapter into the laptop and having to carry it ALL the time, even when not plugged in. For a small netbook it might be 100-200 grams, but for a big laptop it is such a nuisance. Plus, imagine the heat it produces while charging; you don't want another heat source cooking your lap or your palms, do you? The hard disk, GPU and CPU put out enough heat already. You already pointed out the wattage of the CPU, and that doesn't include the other heat-producing components, such as the chipset and the PWM circuitry on the mobo. Heat is the number one problem in laptops, and it can also damage the battery in the long run.

Having x86 and all its capability does have its drawbacks; it's not as power saving as an ARM CPU. But then again, those ARM processors are not comparable to laptop and desktop x86 CPUs in performance or functionality, so it's silly to compare their battery life. See what Intel did with Atom: they dropped out-of-order execution in favor of in-order, and performance dropped by a lot. What's the point of long battery life if the task you're doing takes longer as well? Gain something, lose something. AMD has just started with low-power CPUs, and it is already fruitful compared to Atom: far better performance with comparable battery life. It's just the start; expect it to get better in the next iteration.
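That "longer battery life but slower task" trade-off is really an energy-per-task question: energy = power x time. A back-of-envelope sketch, with the wattages and timings invented purely for illustration:

CODE
# Energy = power x time. A slower, lower-power chip can still cost
# MORE battery per task if it runs long enough. Numbers are invented
# for illustration, not measured Atom/Bobcat figures.

def task_energy(power_watts: float, seconds: float) -> float:
    return power_watts * seconds  # joules

in_order = task_energy(power_watts=4.0, seconds=300)       # slow but frugal
out_of_order = task_energy(power_watts=9.0, seconds=110)   # fast but hungrier

print(f"in-order:     {in_order:.0f} J")       # 1200 J
print(f"out-of-order: {out_of_order:.0f} J")   # 990 J: finishes sooner AND uses less battery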
Najmods
post Jun 30 2011, 11:27 AM



QUOTE(dma0991 @ Jun 30 2011, 09:30 AM)
Unlike Atom, the Bobcat family is quite flexible. Zacate is just a plain 40nm part, and yet it has been tweaked over and over on the same chip until something even lower power, like Ontario and Desna, came out. Now imagine what 28nm and more Bobcat cores could do for it.

3-4 days is insane; you're going against physics itself, or at least the current capacity of battery technology. Even an ARM device like the iPhone 4 wouldn't last a whole day if it were used constantly for 24 hours. If ARM can't achieve 3-4 days of constant use (not standby time), you can't expect x86 to do the same.
*

Maybe he wanted this: an HP EliteBook 8460p with 32-hour battery life thanks to its 'portable' 120W docking station. The laptop weighs in at 2.07kg with the standard 6-cell battery, and the docking station itself weighs a 'light' 1.637kg! There you are, a power-brick-free laptop just for you. Mind you, the rated 32 hours is with WiFi OFF, and presumably at the lowest brightness while idle.

[image: the EliteBook 8460p with its docking station]

Seriously, unless there is a breakthrough in the manufacturing process to reduce transistor leakage, or a breakthrough in battery technology such as carbon nanotube batteries, he is just asking for the impossible.
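The runtime arithmetic is short, which is why multi-day x86 use is so far off: hours = battery capacity (Wh) / average draw (W). The figures below are round illustrative numbers, not the EliteBook's actual specs:

CODE
# Runtime (hours) = battery capacity (Wh) / average system draw (W).
# Round illustrative numbers only.

def runtime_hours(capacity_wh: float, avg_draw_w: float) -> float:
    return capacity_wh / avg_draw_w

six_cell = 60.0       # Wh, typical 6-cell pack (assumed)
dock_pack = 100.0     # Wh, big external dock/slice battery (assumed)

print(runtime_hours(six_cell, avg_draw_w=10.0))              # ~6 h at light load
print(runtime_hours(six_cell + dock_pack, avg_draw_w=5.0))   # ~32 h only if idling at 5 W
print(runtime_hours(six_cell, avg_draw_w=0.8))               # ~75 h: 3 days needs sub-1 W draw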
Najmods
post Jun 30 2011, 07:20 PM



QUOTE(billytong @ Jun 30 2011, 03:18 PM)
It is not asking the impossible. I am just saying the industry seems to be focused on delivering more performance rather than improving TDP when addressing this portion of the market, whereas we know that at a certain speed we are fast enough for basic web surfing/word documents.
*
The Fusion concept IS the answer to your statement. The problem is that consumers demand smooth playback on Flash-heavy sites, 1080i-capable video for home theater, a GPU capable of mainstream games, and multitasking, not just the basics. Otherwise just go back to Atom; yes, it is enough for your stuff, but is that all you do? Do you really just web surf and edit spreadsheets all day without visiting Flash sites or playing games? AMD is trying to deliver good performance for day-to-day usage with minimal power draw. But this is their first step; the technology needs time to grow.
Najmods
post Oct 17 2011, 09:41 AM



I don't know if this has been posted before, but I came upon this stating that the real culprit could be the test-kit motherboard, the Crosshair V Formula.
Najmods
post Nov 23 2011, 09:16 AM



I don't like where AMD is heading; they have gone back to the megahertz race. The old perception that 'AMD is hot', from the Thunderbird era, is going to come back because of this.

It doesn't matter much on the desktop, where you can put a big cooler on them and reach very high speeds, but it won't appeal much to mobile gamers.

Just look at the current AMD Sabine platform, topping out at only 2.1GHz (and in our market it is usually a quad core with a low 1.4GHz base clock). It won't be able to perform in games, because as we can see from the similarly architectured desktop Deneb-based Phenom II, it takes 3GHz and above to reach a 25fps minimum in a game like Skyrim, even with a GTX 570, as per the review here.

The only saving grace for AMD is that, because it is based on the K10 Stars architecture, people can use K10stat to overclock; some with modded cooling have reached 3GHz, as you can read here. Also, Crossfire between the integrated and a dedicated GPU doesn't work in some games, lowering performance instead of increasing it.

I don't see how Trinity could be better than the current Llano on laptops. The GPU could be faster (that's why only its 3D benchmarks have been floating around), but what about the CPU? What might the clocks be? With a base clock of 3+GHz on the desktop, they had better do something on mobile to make it appealing for gamers, because high clock speed doesn't equal low temperature or low power consumption.

AMD could do a lot with Bobcat, really. Just put one in an Ultrabook chassis and voila! An affordable Ultrabook.
Najmods
post Nov 23 2011, 12:03 PM



QUOTE(jonchai @ Nov 23 2011, 11:45 AM)
There's nothing wrong with the megahertz race really, especially with current technology where dies shrink ever smaller. Ultimately the fabs will hit a dead end, which was predicted to be at about 8nm or 4nm, I can't really remember. Instructions per clock cycle matter: the faster it is, the faster things get done. Based on BD, AMD is already some way ahead of what current technology can handle in terms of multi-threaded processing. Sure, Sandy may seem faster on paper, but BD isn't a slouch either in multi-threaded processing. Give AMD some time to fine-tune and GloFo time to sort out their problems, and Piledriver may just be the next true FX chip.
*
High IPC doesn't mean a fast clock; the goal is to execute instructions in as few cycles as possible and do more work per clock cycle, reducing the need for high clock speeds and lowering power consumption. The reason Netburst failed is that it aimed squarely at high clocks instead of high IPC. It was predicted to reach 10GHz, but where did it stop? Not even half of that. Look at Netburst's Rapid Execution Engine: it runs at twice the CPU clock, meaning the Arithmetic Logic Units run at 6GHz when the CPU clock is 3GHz, but did that prove faster than the lower-clocked K8? AMD really needs to do another K8 to give Intel a run for its money; the situation is reversed now. Like I said, high clocks have huge drawbacks, mainly power consumption and heat. Others come in the form of bad yields, due to the difficulty of manufacturing a complex core at high speed.
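The Netburst-versus-K8 point boils down to throughput = IPC x clock. A toy comparison in Python, with both IPC figures invented just to show the trade-off:

CODE
# Relative throughput = instructions-per-cycle x clock (GHz).
# The IPC values are invented to illustrate the trade-off, not measurements.

def throughput(ipc: float, ghz: float) -> float:
    return ipc * ghz  # billions of instructions per second

netburst = throughput(ipc=0.7, ghz=3.8)  # long pipeline, high clock
k8 = throughput(ipc=1.2, ghz=2.4)        # shorter pipeline, higher IPC

print(f"Netburst: {netburst:.2f} GIPS")  # 2.66
print(f"K8:       {k8:.2f} GIPS")        # 2.88 -> wins at a far lower clock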

By the time Piledriver is out, Intel will already have Ivy Bridge ready. Remember, Intel has a process-node advantage over AMD.

Unfortunately, hope alone won't help AMD much. But time will tell whether the improved thread scheduling in Windows 8 will lift its performance as most people hope.
Najmods
post Nov 23 2011, 12:53 PM



For those who haven't read about the advantages of thread scheduling when it is done correctly, read here.
Najmods
post Nov 23 2011, 05:42 PM



Well, the thread scheduling is done to trigger higher Turbo Core clocks by keeping the work on fewer modules, so you could get much the same effect by simply overclocking manually anyway.
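Here is a rough sketch of that module-aware idea under my own simplified model (not the actual hotfix logic): fill the second core of an already-busy module before waking a fresh one, so idle modules can power down and Turbo Core gets headroom:

CODE
# Simplified model of module-aware scheduling on a 4-module Bulldozer.
# Cores (2m, 2m+1) share module m. Packing threads onto as few modules
# as possible lets idle modules power down, giving Turbo Core headroom.
# This is a toy model, not the real Windows scheduler.

NUM_MODULES = 4

def pick_core(busy_cores: set[int]) -> int:
    """Prefer the idle sibling of a half-busy module, then wake a new module."""
    for m in range(NUM_MODULES):
        a, b = 2 * m, 2 * m + 1
        if (a in busy_cores) != (b in busy_cores):
            return b if a in busy_cores else a   # fill the half-used module
    for m in range(NUM_MODULES):
        if 2 * m not in busy_cores:
            return 2 * m                          # wake a fresh module
    raise RuntimeError("all cores busy")

busy: set[int] = set()
for thread in range(4):
    core = pick_core(busy)
    busy.add(core)
    print(f"thread {thread} -> core {core}")  # 0, 1, 2, 3: two modules stay asleep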

But I particularly like this quote near the end of the article; it could benefit AMD somewhat:

QUOTE
Trouble is, right now, Intel has much better OS and application support for Hyper-Threading than AMD does for Bulldozer. In fact, we're a little surprised AMD hasn't attempted to piggyback on Intel's Hyper-Threading infrastructure by making Bulldozer processors present themselves to the OS as four physical cores with eight logical threads. One would think that might be a nice BIOS menu option, at least. (Hmm. Mobo makers, are you listening?)

Najmods
post Dec 2 2011, 01:53 PM



QUOTE(tech3910 @ Dec 2 2011, 12:21 PM)
it's ironic how AMD's graphics division mocks Nvidia's power consumption while their own CPU division is f***ing power hungry.
*
It's not ironic. Not sure if you're serious, but look back at Intel's Pres-hott era: their mobile segment excelled in both performance and power consumption, even beating desktop CPUs despite much lower clock speeds. The reason is simple: it was a different division.
Najmods
post Dec 4 2011, 05:26 PM



QUOTE(jonchai @ Dec 4 2011, 07:27 AM)
I believe the 2B transistor count was meant for Trinity with its on-die graphics processor, but their marketing people screwed up.
*
Don't be so sure; read this.
Najmods
post Aug 23 2012, 09:53 PM



It's not a failed product, not by a long shot. It's just that several things work in Intel's favor: one is the Intel compiler crippling AMD CPUs when it detects one, and another is that the Bulldozer architecture is geared toward multithreading (workstation performance) while most software, including games, favors high IPC. Bulldozer is just the base for things to come from AMD, namely fusing the GPU and CPU; the current APU is not the final product, because the GPU does not yet fully assist the CPU in some tasks.
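The 'compiler crippling' point refers to vendor-based CPU dispatching, as documented in Agner Fog's compiler writeups: binaries built with Intel's compiler historically took the fast vectorized path only after checking the CPU vendor string, not just the feature flags. A schematic illustration (my own code, not Intel's):

CODE
# Schematic of vendor-based CPU dispatching as reported in the Agner Fog
# writeups; an illustration of the idea, not Intel's actual source.

def pick_code_path(vendor: str, has_sse2: bool) -> str:
    if vendor == "GenuineIntel" and has_sse2:
        return "sse2_fast_path"
    # Non-Intel CPUs fall through to the baseline path even when they
    # support the exact same instruction set extensions.
    return "generic_baseline_path"

print(pick_code_path("GenuineIntel", has_sse2=True))   # sse2_fast_path
print(pick_code_path("AuthenticAMD", has_sse2=True))   # generic_baseline_path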

Intel has one advantage over AMD, a finer manufacturing process, but that too has limits. Look at Ivy Bridge, which was supposed to perform better than Sandy Bridge but is in fact only a few percent faster, and runs hotter as well, even with the IHS removed.
Najmods
post Sep 29 2012, 04:17 PM



Don't know if this has been posted or not. Good news for someone like me who bought an AM3+ mobo recently: it looks like Steamroller is coming to AM3+ after all. A mobo with a 5-year warranty and support for the upcoming CPU is a big win, and one of the few reasons I stick with AMD.

AMD sticks with Socket AM3+ for Steamroller.
Najmods
post Oct 6 2012, 12:48 PM



QUOTE(djlah @ Oct 6 2012, 12:34 PM)
@nill
Thanks, bro, for sharing the Newegg quote.
I will wait for lingloong's MB update first. BTW, do you know the Trinity CrossFire matrix? I would like to know whether the A10-5800K can pair with a 7750, or only with lower-range cards.

Nvidia just introduced the GTX 650 and AMD quickly announced a price cut. Now we wait for all the brands to revise their graphics card pricing.
*
All Trinity APUs use the VLIW4 architecture of the previous HD 6k series cards, so you can only CrossFire them with a VLIW4-based card from the HD 6k series, something like the HD 6670. I wonder why they used HD 7k numbering; probably for marketing purposes.
Najmods
post Oct 6 2012, 01:37 PM



QUOTE(k!nex @ Oct 6 2012, 01:11 PM)
The HD 6k series has only one die using the VLIW4 architecture: the Cayman die (69xx series). All the others use VLIW5, including the 6670, which inherited a lot from the old 5670.
*
True, but the reviewers who tested this CPU claimed it can only be CrossFired with HD 6k GPUs; at least the HD 7660D can.
Najmods
post Oct 12 2012, 11:24 AM



QUOTE(coolstore @ Oct 9 2012, 08:36 PM)
I'm looking for a 35W desktop CPU; too bad AMD doesn't have that sort of product line now. The E-450 parts are really power saving, but the performance is not that good. Now I'm looking at Intel's power-efficient T/S series CPUs.
*
You don't have to go Intel. All modern AMD CPUs can use a piece of software called AMD PSCheck to control voltages and clock speeds, and you can get a lot lower than the 35W you're after. My Phenom II idles at 13W (according to HWMonitor) or 7W (according to the eXtreme PSU calculator), since it idles at 800MHz at a very low 0.65V.
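Those idle figures make sense because dynamic CPU power scales roughly as P = C x V^2 x f. A quick ratio check of my idle state against an assumed stock state (the stock numbers are illustrative; the constant C cancels out):

CODE
# Dynamic CPU power scales roughly as P = C * V^2 * f.
# Compare the idle state (800 MHz @ 0.65 V) against an assumed stock
# state (3200 MHz @ 1.35 V, illustrative); C cancels in the ratio.

def relative_power(volts: float, mhz: float) -> float:
    return volts ** 2 * mhz

stock = relative_power(1.35, 3200)   # assumed stock P-state
idle = relative_power(0.65, 800)

print(f"idle draws ~{100 * idle / stock:.0f}% of stock dynamic power")  # ~6%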
