
What Gigabyte didn't tell us in the press release for the new P55A motherboards


Patric99

New Member
Joined
Oct 29, 2009
Sure, Gigabyte's new P55A line-up of motherboards looks good... until you read the fine print!

Only x8 for your graphics card. Yes, only x8! Or x16 using PCIe rev 1, and rev 1 is half the speed of rev 2: 250 MB/s per lane for rev 1 versus 500 MB/s per lane for rev 2.
So the only way to get full speed on the x16 (rev 2) slot for your graphics card is not to use USB3 or SATA3. As soon as you use USB3 or SATA3, you cripple the graphics card bus.
To make clear what I am saying, here are some examples:

Ex 1: USB3/SATA3 not enabled
x16 = 16 lanes of PCIe rev 2 = 16 x 500 = 8,000 MB/s for your graphics card's x16 slot

Ex 2: USB3/SATA3 enabled, running at rev 1 (this setup cripples the SATA3 and USB3)
x16 = still x16, but at rev 1 = 16 x 250 = 4,000 MB/s max for your graphics card's x16 slot

Ex 3: USB3/SATA3 enabled, running at rev 2
x16 becomes x8, but at rev 2 = 8 x 500 = 4,000 MB/s max for your graphics card's x16 slot

Sorry, this motherboard does not look as good as its cover!
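The three examples above are simple per-lane arithmetic. Here is a minimal Python sketch of that math, assuming the per-lane, per-direction rates quoted in the post (250 MB/s for PCIe 1.x, 500 MB/s for PCIe 2.0); the function name is illustrative only.

```python
# Per-lane, per-direction effective bandwidth in MB/s, per PCIe revision.
PER_LANE_MB_S = {1: 250, 2: 500}


def slot_bandwidth(lanes: int, revision: int) -> int:
    """Total one-direction bandwidth of a slot in MB/s."""
    return lanes * PER_LANE_MB_S[revision]


# Ex 1: USB3/SATA3 disabled, full x16 at rev 2
print(slot_bandwidth(16, 2))  # 8000

# Ex 2: USB3/SATA3 enabled, slot falls back to rev 1
print(slot_bandwidth(16, 1))  # 4000

# Ex 3: USB3/SATA3 enabled, slot stays rev 2 but drops to x8
print(slot_bandwidth(8, 2))   # 4000
```

Either way, enabling USB3/SATA3 halves the graphics slot's bandwidth in this model.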
 

Only one post on this account, and it is to bash Gigabyte?
 
I love Gigabyte, but this is ugly!
I had a regular GB P55-UD5 board on order; when I saw the press release for this board, happy as I was, I cancelled my order.
But then I downloaded the manual, and after reading it, yes, I was really upset.

And don't be fooled by this being my first post. I have been reading this and a couple of other forums, if not every day then at least every week, for many, many years.
My first computer was a Sinclair Spectrum, a computer that came out before the Vic 20.

Do you think it is better for people not to know that if they buy this MB they will get a crippled x16 bus as soon as they start using USB3 and SATA3?

And if you think that, I am very sorry I posted this info.

My thinking: it would have been better for GB to tell people about this right away.

And guess what! My next motherboard will be a Gigabyte, a P55-UD5, but not the new P55A. And think about it: you probably would not buy a P55A MB from GB, would you?

And if you would, please enlighten me why you would go for a MB with a crippled x16 PCIe bus.

And WonderingSoul, as my other post was my first post here, thanks for making me feel welcome when I provided good, useful info for everyone who was thinking about buying the new GB P55A.
 
To translate the post a bit: basically, these mobos use the CPU PCIe lanes for the SATA3 and USB3.0 controllers. What that means is that if you enable SATA3/USB3.0, the main graphics slot drops from PCIe 2.0 x16 to PCIe 2.0 x8; or, to put it another way, using SATA3/USB3.0 has the same effect as putting a card in the secondary physical PCIe x16 graphics slot.
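The lane trade-off above can be sketched in a few lines, assuming (per the post) that the CPU's 16 PCIe 2.0 lanes are split evenly when the SATA3/USB3.0 controllers draw from them; the names here are hypothetical, not from any Gigabyte documentation.

```python
# The P55 platform's CPU provides 16 PCIe 2.0 lanes. Per the post above,
# feeding the SATA3/USB3.0 controllers from those lanes leaves the
# primary graphics slot with half of them.
CPU_PCIE2_LANES = 16


def graphics_lanes(sata3_usb3_enabled: bool) -> int:
    """Lanes left for the primary graphics slot under this assumption."""
    return CPU_PCIE2_LANES // 2 if sata3_usb3_enabled else CPU_PCIE2_LANES


print(graphics_lanes(False))  # 16 -> PCIe 2.0 x16
print(graphics_lanes(True))   # 8  -> PCIe 2.0 x8, same as a two-card setup
```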
 
And if you would, please enlighten me why you would go for a MB with a crippled x16 PCIe bus.
Because a person may not game much, or because a PCIe 2.0 slot at x8 won't hold back any video card except the HD 5870, and that's only by 1-2%. ;)



This isn't really as big a deal as you make it out to be (or as I interpreted your posts). I'm sure something will be mentioned about it either in reviews or on their website. Of course an optimal setup wouldn't do that, but this isn't a fundamental flaw with the board that warrants a thread alluding to Gigabyte misleading the public...

Thank you for the useful information. Just to note, when I read your post I thought you were trolling too, the way it's written. Oh well, moving on... welcome, my man! :)
 
Thanks very much, EarthDog, for welcoming me. :)
I would guess my writing seems strange sometimes, because:
1. I am from Sweden.
2. I am dyslexic.
A bad combo when writing in English.

I do agree that a lot of people who are not gamers will find it less of a hold-back, but they are a big customer group, and many of them are going to be the first ones to buy the new SATA3 SSDs when they first come out.

Funny thing: I am not a gamer, and I do not OC (okay, on slower setups like a 2160 setup, but not on 1156; I need a cold, whisper-quiet setup). But a lot of gamers and overclockers use high-end systems and have a lot of knowledge, so they are good people to listen to and to watch what they are buying (for example, the best mice there are, are the Razer brand).

Are you really sure it will only be a 1-2% difference, and only with something like the 5870? I have seen tests that show everything from 1% to maybe 15% difference in a single-card setup (with much slower cards than the 5870); the difference will be less using SLI or CrossFire.
The difference depends a lot on the game/app and the resolution, as I have understood it, but maybe I am wrong about that.

This is one showing about a 7% difference, which in my book is quite a big loss:
http://www.tomshardware.co.uk/PCI-Express-2.0-Crossfire,review-29948-3.html
Here is another one showing a very small to quite big difference:
http://www.tomshardware.com/reviews/pci-express-scaling-analysis,1572-8.html

Yes, you can find this on the spec pages of the boards and in the manual, but it still doesn't state outright that you will lose bandwidth for your graphics card in every possible setup when enabling USB3 or SATA3. You have to know how to do the math to understand it, which most people don't.

And I still think it would have been more honest of GB to be more up front about this issue, and not hide it away so that you have to read the manual to understand the whole crippled-x16 situation you will get.
Especially because the majority who buy these boards at the start will be gamers.
And it is always better for people to know the bad and the good about a product when buying it than to find out after.
Finding out after only makes sad/angry customers.

And basically that is why I posted about it.
 
Here is some testing done between the P55 and X58 platforms done internally: http://www.ocforums.com/showthread.php?t=619246

Your first Tom's link is on PCIe 1.1 and nearly 2 years old. The second article is even older (2007) and is also PCIe 1.1. These are not valid comparisons at all. I would look up something that was completed on the P55 chipset to see real results.

When this board is released, I'm sure plenty of gamers will have read the reviews. They should also know that it barely matters.

Again, great information, but it really doesn't matter much, as proven by the link I provided.

EDIT: Here is a link to a summary page that shows what happens to the card at PCIe 2.0 x16/x8/x4/x1: http://www.techpowerup.com/reviews/AMD/HD_5870_PCI-Express_Scaling/25.html

1-2%, like I said. :)
 
Ahha, PCIe 1.1, that is what I missed. So true, so true.

And thanks for the links.

Now I must say it feels much better that I cancelled my first order of the original P55. For me, who is no hardcore gamer (I only play sometimes and do HD video editing, if we are talking about what my computer needs a good graphics card for), the P55A is starting to look really good. Losing 1-2% is nothing for me, compared to gaining a lot on the next-gen SSD drives (I have some current-gen ones as well).

Once again, thank you for clearing it up. :)
 
No problem...
Also, unless you are putting a couple of SSDs in RAID 0, you won't even be limited by the SATA II bandwidth, so I wouldn't really make that a deciding factor; again, unless you are using a couple of SSDs in RAID 0.

:)
 
+ Gigabyte P55 boards use Foxconn sockets and P55A boards use Lotes sockets. No reason, from my point of view, to choose a P55 over a P55A.
 
This issue is similar to some of the current P55 boards that claim to be CrossFireX capable even though their second PCI Express 2.0 slot only runs at x4.
 
To translate the post a bit, basically these mobos use the CPU PCIe lanes for the SATA3 and USB3.0 controllers. What that means is if you enable SATA3/USB3.0 the main graphics slot drops from PCIe 2.0 16x to PCIe 2.0 8x, or to put in another way using SATA3/USB3.0 has the same effect as putting a card in the secondary physical PCIe 16x graphics slot.

Sort of....the Gigabyte boards give you two options with USB 3.0...the first will use PCIe lanes from the CPU as described by MadMan007, but there is another option. There is a switch in the BIOS to run the USB 3.0 controller via a PCIe 1.1 lane via the P55 chipset instead, saving all 16 precious PCIe 2.0 lanes for graphics. This does limit the USB 3.0 performance somewhat...so it is a bit of give and take, but at least Gigabyte allows you the option.

Watch for my USB 3.0 review posted a bit later today :thup:

edit: my review is posted...http://www.xtremesystems.org/forums/showthread.php?p=4128308#post4128308
 
+ Gigabyte P55 boards use Foxconn sockets and P55A boards use Lotes sockets. No reason, from my point of view, to choose a P55 over a P55A.

That's what I would like to confirm after having a nasty experience with a Foxconn socket on an MSI P55-GD65 board. Do the new Gigabyte P55A boards exclusively use Lotes sockets?
 
That's what I would like to confirm after having a nasty experience with a Foxconn socket on an MSI P55-GD65 board. Do the new Gigabyte P55A boards exclusively use Lotes sockets?

Would like to know that too...
 
A bit of a PCIe 2.0 breakdown for you, since I didn't see anyone touch on it...

PCIe 2.0 to 1.0 lane width (equal bandwidth):
PCIe 2.0 x16 = PCIe 1.0 x32
PCIe 2.0 x8 = PCIe 1.0 x16
PCIe 2.0 x4 = PCIe 1.0 x8
PCIe 2.0 x2 = PCIe 1.0 x4 (though I have never seen an x2 slot)
PCIe 2.0 x1 = PCIe 1.0 x2

PCIe 2.0 has double the bandwidth of PCIe 1.0...

Just about any current GPU is not using much over PCIe 1.0 x8 bandwidth. The only exceptions to the rule are dual-GPU video cards.
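The equivalence list above boils down to one rule: a PCIe 2.0 link carries as much data as a PCIe 1.0 link with twice the lanes. A tiny sketch of that rule (function name is just for illustration):

```python
def pcie1_equivalent_width(pcie2_lanes: int) -> int:
    """PCIe 1.0 lane count with the same total bandwidth as a 2.0 link."""
    return pcie2_lanes * 2


# Reproduces the table above for the common slot widths.
for lanes in (16, 8, 4, 2, 1):
    print(f"PCIe 2.0 x{lanes} = PCIe 1.0 x{pcie1_equivalent_width(lanes)}")
```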
 