 Super Noobs Help! [Thread V2], Post Your ULTRA Simple Questions Here

amduser
post Feb 18 2008, 07:17 PM

Look at all my stars!!
*******
Senior Member
5,542 posts

Joined: Dec 2006


Yes. CrossFire can pair cards from different manufacturers, and (I'm not very sure about this) even different card models, as well as different clock speeds.
amduser
post Feb 22 2008, 10:48 PM


QUOTE(ngwinnie @ Feb 22 2008, 08:46 PM)
How important is a voltage regulator? Is it safer during a lightning strike?

With the computer, laptop charger, lights and all the other electrical items in the room sharing one plug point, is it safer to have an AVR?

How do you choose an AVR?

Will the price of LCD monitors come down after some time, like other PC hardware?
*
I don't know much about voltage regulators, but I think an AVR can protect your PC and other electrical devices from power surges. When lightning is about to strike there isn't much you can do except take precautions, like unplugging everything from the main power source; voltage regulators and other protection devices can sometimes fail you.

Do you know how much LCDs cost when they were first introduced to the public? A 17" LCD cost more than RM1k, and now it's only about RM500. When I bought my 17" LCD it cost me RM600-700.
amduser
post Feb 23 2008, 08:44 AM



QUOTE(speedguy10 @ Feb 23 2008, 12:17 AM)
I want to know something about RAM. I'm pretty sure this question has been asked many times, but I hope an expert can spare a moment to give me a brief guide.

What are the differences between value RAM, performance RAM and gaming RAM?

My current understanding is that the latency of the three ranks as:

value RAM > gaming RAM > performance RAM.

Would it be considered a waste if a user buys performance/gaming RAM without overclocking it?
*
Basically, performance RAM runs faster than value RAM and overclocks better.

I've never heard of "gaming RAM" before; I think the gaming RAM you mention is the same thing as performance RAM.
amduser
post Feb 24 2008, 11:04 AM



QUOTE(wen9x88 @ Feb 23 2008, 12:11 PM)
Which casing comes with a good PSU for gaming? Because a Ninja casing plus an ST56F is quite expensive...
*
The ST56F alone already costs about RM350, so imagine what the price would be bundled with a casing.

You need to look at your hardware components and choose the PSU accordingly. Normally 450W true power or above is suitable; an 8800GT or higher-end graphics card may need 500W true power to run (without an SLI setup).

QUOTE(wen9x88 @ Feb 23 2008, 01:04 PM)
Next month I get my whole rig, because next month I get RM4000, the PC fair opens, and the 9600GT has just launched... lucky I have time to read 9600GT user comments first and learn more.
*
The 9600GT? It's a mid-range card: performance below the 8800GT, the replacement for the 8600GT but faster than it. It costs about RM600 now.

QUOTE(alwizbthere @ Feb 23 2008, 02:24 PM)
What has the PC fair got to do with buying a PC? You're not likely to get any discount at the PC fair on a full system spec. Frankly, I'm sceptical of PC fair hardware anyway; some shops use it to clear old stock.
*
I feel the PC fair has plenty of fishermen out to hook naive buyers.

QUOTE(wen9x88 @ Feb 23 2008, 02:36 PM)
A Samsung monitor and a Zotac 8800GT graphics card...
Anyway, I don't have the money to buy my whole rig now; I've only got RM1.5k, so I have to wait until next month when I'll have RM4k++.
*
You can choose to buy it piece by piece. With the RM1.5k you have now, you could buy the motherboard and graphics card first, or the monitor and graphics card.

QUOTE(soul_star @ Feb 24 2008, 10:41 AM)
Erm, what does SLI mean, and what is it used for?
*
It means running two graphics cards at the same time to improve performance, something like a 30% increase. It only really makes sense for high-end cards; I don't think anyone runs SLI with low-end or mid-range cards.
amduser
post Feb 25 2008, 10:44 AM



QUOTE(mr.monkeykingbar @ Feb 25 2008, 02:10 AM)
Pros, please tell me: for gaming, an E8400 is more than enough, because of its high clock speed and because you probably won't be running a lot of applications at the same time.
What is a quad-core like the Q9450 for? What kind of programs fully utilise a quad-core's potential? Slower clocks, but more cache and more cores, of course.
*
A quad-core is better than a dual-core overall; don't just compare the GHz, compare the rest too. Not all software can fully utilise four cores, but CAD and Photoshop-type programs probably can.
amduser
post Feb 25 2008, 10:43 PM



QUOTE(jejari7 @ Feb 25 2008, 11:41 AM)
Hi, I want to ask: is there any such thing as an adapter to convert an AGP 8x slot to PCIe, so I can upgrade my graphics card to PCIe?
*
No, there isn't; the two slots are not compatible.

QUOTE(mr.monkeykingbar @ Feb 25 2008, 12:47 PM)
Is PCIe x16 = PCIe 1.0, and PCIe x32 = PCIe 2.0?
What about 2x16: is that two PCIe 1.0 slots, or PCIe 2.0?
Anyway, only X38 boards offer PCIe 2.0, right? P35 is only 1.0?
*
I'm not sure about all of that, but note that x16 refers to the number of lanes in the slot, not the PCIe version. What I do know is that PCIe 2.0 is the upgraded version of PCIe 1.0, with double the bandwidth per lane. Theoretically PCIe 2.0 performs better than 1.0, but not noticeably so; the improvement is slight enough that you can't actually feel it.
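To put rough numbers on the bandwidth difference: a back-of-envelope sketch using the published theoretical per-lane rates (250 MB/s for PCIe 1.0 and 500 MB/s for PCIe 2.0, after 8b/10b encoding overhead), not measured throughput.

```python
# Theoretical per-lane throughput per direction, after 8b/10b encoding:
# PCIe 1.0 signals at 2.5 GT/s -> 250 MB/s per lane
# PCIe 2.0 signals at 5.0 GT/s -> 500 MB/s per lane
PER_LANE_MB_S = {"1.0": 250, "2.0": 500}

def slot_bandwidth_gb_s(version: str, lanes: int = 16) -> float:
    """Theoretical one-direction bandwidth of a PCIe slot, in GB/s."""
    return PER_LANE_MB_S[version] * lanes / 1000

print(slot_bandwidth_gb_s("1.0"))  # x16 slot, PCIe 1.0 -> 4.0
print(slot_bandwidth_gb_s("2.0"))  # x16 slot, PCIe 2.0 -> 8.0
```

So an x16 slot doubles from 4 GB/s to 8 GB/s on paper, but a 2008-era graphics card rarely saturates even the 1.0 link, which is why the difference is hard to feel in practice.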
amduser
post Feb 28 2008, 11:52 AM



QUOTE(sang_karim @ Feb 26 2008, 08:30 PM)
Noob question here... I just bought a Western Digital 160GB SATA II HDD. I'm planning to make two partitions on it (80GB + 80GB), then copy my old Windows install from my IDE HDD onto the first partition using Acronis (still learning how to use it, though). My question is: right now the WD drive acts as a secondary HDD. Can I partition it as it is, or do I have to create the partitions from DOS while installing a fresh Windows, like normal practice? Help needed!
*
Yes, you can: create the two partitions when you format the HDD. If you want the new HDD to become the master/primary drive, switch its connection afterwards.

QUOTE(sca1lywag @ Feb 27 2008, 12:40 AM)
Hi..

I want to get a new hard disk for my ancient P4X-266 board. Can I get a new IDE Seagate/WD with 8MB cache, 80GB or 160GB? I'm not sure if my motherboard supports them.

Help! Replies will be appreciated.
*
Why not? As long as you have an IDE slot and an IDE cable it should work (though a board that old may need a BIOS update to see drives larger than 137GB).

QUOTE(wen9x88 @ Feb 27 2008, 07:14 PM)
Can I SLI one 8800GT with one 9600GT?
*
No. SLI needs two cards of the same model; it's SLI, not CrossFire.
amduser
post Feb 28 2008, 10:15 PM



QUOTE(alwizbthere @ Feb 28 2008, 12:26 PM)
...
What does wen9x88's question have to do with CrossFire?
Even CrossFire can't pair cards from different series.
Most of the time you post reasonably, but sometimes crap does spew from you... your answer this time is totally unrelated, although his question was nonsense too.
*
CrossFire was first made available to the public on September 27, 2005.

QUOTE
The system required a CrossFire-compliant motherboard with a pair of ATI Radeon PCI Express (PCIe) graphics cards, which can be enabled via either hardware or software. Radeon X800s, X850s, X1800s and X1900s came in a regular edition and a 'CrossFire Edition', which has 'master' capability built into the hardware, provided by 5 extra compositing chips. One must have bought a Master card and paired it with a regular card from the same series. The Master card shipped with a proprietary DVI Y-dongle, which plugs into the primary DVI ports of both cards and into the monitor cable. This dongle serves as the main link between both cards and carries the output to the monitor. Radeon X1300s and X1600s have no 'CrossFire Edition' but are enabled via software, with communication forwarded via PCI Express. ATI has not created the infrastructure to allow FireGL cards to be set up in a CrossFire configuration. The 'slave' graphics card needed to be from the same family as the 'master', regardless of whether the 'master' is designated by the hardware or by software.

An example of a limitation in regard to a Master-card configuration would be the first-generation CrossFire implementation in the Radeon X850 XT Master Card. Because it used a compositing chip from Silicon Image (SiI 163B TMDS), the maximum resolution on an X850 CrossFire setup was limited to 1600×1200 at 60 Hz, or 1920×1440 at 52 Hz. This was considered a problem for CRT owners wishing to use CrossFire to play games at high resolutions, or owners of widescreen LCD monitors. As many people found a 60 Hz refresh rate on a CRT to strain the eyes, the practical resolution limit became 1280×1024, which did not push CrossFire enough to justify the cost.[2] The next generation of CrossFire, as employed by the X1800 Master cards, used two sets of compositing chips and a custom dual-link DVI Y-dongle to double the bandwidth, raising the maximum resolution and refresh rate to far higher levels.


QUOTE
When used with ATI's "CrossFire Xpress 3200" motherboard chipset, the 'master' card is no longer required for every "CrossFire Ready" card (with the exception of the Radeon X1900 series). With the CrossFire Xpress 3200, two normal cards can be run in a Crossfire setup, using the PCI-E bus for communications. While performance was impacted, this move was viewed as an overall improvement in market strategy, due to the fact that Crossfire Master cards were expensive, in very high demand, and largely unavailable at the retail level.


Although the CrossFire Xpress 3200 chipset is indeed capable of CrossFire through the PCI-e bus for every Radeon series below the X1900s, the driver accommodations for this CrossFire method have not yet materialized for the X1800 series. ATI has said that future revisions of the Catalyst driver suite will contain what is required for X1800 dongleless CrossFire, but has not yet mentioned a specific date.


QUOTE
With the release of the Radeon X1950 Pro (RV570 GPU), ATI has completely revised CrossFire's connection infrastructure to further eliminate the need for past Y-dongle/Master card and slave card configurations for CrossFire to operate. ATI's CrossFire connector is now a ribbon-like connector attached to the top of each graphics adapter, similar to nVidia's SLi bridges, but different in physical and logical natures. As such, Master Cards no longer exist, and are not required for maximum performance. Two dongles can be used per card; these were put to full use with the release of CrossFire X. Radeon HD 2900 and HD 3000 series cards use the same ribbon connectors, but the HD 3800 series of cards only require one ribbon connector, to facilitate CrossFire X.[4] Unlike older series of Radeon cards, different HD 3800 series cards can be combined in CrossFire, each with separate clock control.

Since the release of the codenamed Spider desktop platform from AMD on November 19, 2007, the CrossFire setup has been updated with support for a maximum of four video cards with the 790FX chipset; the CrossFire branding was then changed to "ATI CrossFire X". The setup, according to internal testing by AMD, will bring at least 3.2x performance increase in several games and applications which required massive graphics capabilities of the computer system, the setup is targeted to the enthusiast market. Later developments include a dual GPU solution to be released early 2008, the "ATI Radeon HD 3870 X2", featuring only one CrossFire connector for dual card, four GPU scalability.

(Source: Wikipedia.)

Hmm, maybe my analysis was wrong?


Added on February 28, 2008, 10:15 pm
QUOTE(wen9x88 @ Feb 28 2008, 06:31 PM)
When will Nvidia launch the 9800GT?
*
I've heard of the 9800GX2, but I haven't heard of a 9800GT. I think the 9800GX2 will launch in mid-March.

This post has been edited by amduser: Feb 28 2008, 10:15 PM
amduser
post Mar 2 2008, 02:51 PM



QUOTE(Prodigenous Zee @ Mar 2 2008, 01:22 PM)
Question here: if a motherboard can support DDR3 RAM, can it also support DDR2 RAM?
*
No, unless your motherboard specifically supports both DDR2 and DDR3 (a few combo boards have both kinds of slots).
amduser
post Mar 2 2008, 07:53 PM



QUOTE(Prodigenous Zee @ Mar 2 2008, 03:00 PM)
I see, thanks. Can you show me a mobo with both DDR2 and DDR3 RAM support?
*
[image attachment: a motherboard with both DDR2 and DDR3 slots; not preserved]
amduser
post Mar 6 2008, 11:18 AM



QUOTE(DeathChaosX @ Mar 6 2008, 07:32 AM)
Hey guys, is there any way I can check when my PC restarted and what caused it?

I left my PC on overnight, and when I woke up I saw the download window had closed and the PC showed signs of having restarted.
*
You can try Event Viewer: Control Panel > Administrative Tools > Event Viewer.
amduser
post Mar 6 2008, 10:46 PM



QUOTE(DeathChaosX @ Mar 6 2008, 08:07 PM)
Thanks for the reply.
Another question: is the Biostar R690A-M2T the same as the Biostar TA690G?

Edit: Also, what's the difference between DirectX 9 and DirectX 10? When I play Lost Planet on DirectX 10 default settings I only get around 20 FPS, but on DirectX 9 I get around 50.
*
Biostar R690A-M2T? Sorry, I've never heard of it before, but I think those are two different boards.

DirectX 10 gives better image quality than DirectX 9, but lower-end graphics cards like the 8600GT may struggle to give you above 30 FPS; it depends on the game. In any case, there aren't many games that support DX10 yet.
amduser
post Mar 6 2008, 11:16 PM



QUOTE(DeathChaosX @ Mar 6 2008, 11:09 PM)
I'm also not sure, but I think it's the same board, because I just checked my motherboard's box and found another name, R690AM2T, on the barcode.

But the main model name on the top of the box is TA690G; I think the R690A-M2T name is because it's made in China.
*
I think that's not the model name but some code they use internally to identify the board. I'm not sure either, but I don't think it matters to us.
amduser
post Mar 9 2008, 10:24 PM



QUOTE(skyzero @ Mar 7 2008, 05:53 PM)
OK, another question: what does it mean when a PSU is rated 550W... can it peak to 600W?
*
It can, but running at peak stresses the PSU and will shorten its lifespan.

QUOTE(ks_kore @ Mar 8 2008, 03:32 AM)
I have a doubt here: if I replace my motherboard with a new one of the same model as my old board, do I need to reinstall Windows when I connect the hard disk to the new board?
*
No need, as long as it's the same model.

QUOTE(wen9x88 @ Mar 8 2008, 12:26 PM)
Processor:AMD AthlonX2 6000+ 3.0Ghz =Rm535
Mobo:Biostar Tforce 690G
Ram:Kingston 1G 667Mhz ddr2 x4 =Rm260 (Each65)
HDD:WD 320G sata 16mb =Rm260
GC:Sapphire 3870 512mb ddr4 =RM800
SoundCard:Creative X-fi Extreme audio =RM245
Speaker:Altec Lansing V4121 =RM235
Rom:Pioneer 18X DVD-RW sata =RM120
LCD:Samsung 22'' =RM1000
Psu : Silverstone ST50F 500w
Casing Fans:Add on some casing fans:RM50
Keyboard&Mouse:A4Tech set=Rm80
Yesterday I went to Lowyat Plaza, and the Viewnet staff told me to get a good motherboard and PSU.
Because I run my PC 24/7, he advised a good motherboard and a 600W PSU, and said adding more HDDs would be no problem.
But I think I'll only use 1-2 HDDs.

These three items are not confirmed:

Silverstone ST50F 500W or a 600W PSU?
Biostar TForce 690G or a better motherboard?
And he says the XFX 8800GT 512MB Extreme is the best, or should it be the Zotac 8800GT 512MB AMP edition?
*
500W true power should be enough; if not, then choose the ST56F, it's quite nice.
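Sizing advice like "500W true power should be enough" is really just addition with headroom. A sketch, where every wattage figure is an illustrative guess for a 2008-era single-GPU build, not a measured value:

```python
# Rough peak-draw guesses per component (illustrative only):
components_w = {
    "cpu": 95,          # e.g. a 95W-TDP dual core
    "gpu": 110,         # e.g. an 8800GT under load
    "motherboard": 40,
    "ram": 10,          # two sticks
    "hdd": 15,
    "optical_drive": 20,
    "fans_and_misc": 20,
}

def recommended_psu_w(components, headroom=0.4):
    """Sum the estimated draws, add headroom, round up to the next 50W."""
    total = sum(components.values()) * (1 + headroom)
    return int(-(-total // 50) * 50)

print(recommended_psu_w(components_w))  # -> 450
```

With those guesses the sketch lands on 450W, matching the usual "450W true power, 500W for a big graphics card" advice; swap in your own components to re-estimate.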
amduser
post Mar 10 2008, 12:57 PM



QUOTE(wen9x88 @ Mar 9 2008, 11:09 PM)
Wow, I didn't know. Is the Silverstone ST50F true-rated or not? Does anyone know whether this PSU is true power?
*
Of course it is; otherwise we wouldn't have recommended it.

QUOTE(wen9x88 @ Mar 9 2008, 11:37 PM)
I don't get what you mean...
That guy says the Biostar TForce 780 is 261% faster than the current 690 chipset.
Is this true or not?
*
QUOTE(wen9x88 @ Mar 10 2008, 12:16 AM)
Use it? You mean the Biostar TForce 780 and 690 motherboards?
Right now I'd take the Biostar TForce 690, but if the TForce 780 is really 261% better than the 690, then I'll take the TForce 780.
*
Most motherboards are much the same, and the board won't really affect the graphics card's benchmark score. Every motherboard, cheap or expensive, does its basic job well: acting as the interface that connects all the hardware components together. Boards that are good for overclocking are good because they have extra or unique features aimed at overclocking and the like.

It's like buying a mobile phone: the more expensive, the more functions, but they can all still make a call.

This post has been edited by amduser: Mar 10 2008, 12:58 PM
amduser
post Mar 11 2008, 10:12 AM



QUOTE(wen9x88 @ Mar 10 2008, 01:00 PM)
Oh I see... it's because those people say it's faster than the 690 that I'm confused. For now I'll take the Biostar TForce 690; I don't overclock, I just run the PC 24/7.

This guy says:
From what I know this chipset is 261% faster than the current 690 chipset, and it supports Hybrid CrossFire with the 3400-range graphics cards.
Here are current benchmarks on these chipsets:
AMD Athlon X2 5000+, 2GB DDR2-800 Corsair, Vista.
3DMark05: 690G = 1105, RS780 = 2495
Vista Experience Index: 690G = 3.1, RS780 = 3.5
FEAR demo: 690G = 15 FPS, RS780 = 29 FPS
3DMark06 on 690G: score = 321, SM2 = 147, HDR/SM3 = 0, CPU score = 1860
3DMark06 on RS780: score = 1162, SM2 = 380, HDR/SM3 = 441, CPU score = 1869
*
I think those numbers are for the onboard graphics; you don't need to care about them if you're using an add-on graphics card. If you don't overclock, any motherboard will do, depending on your budget; the motherboard won't really affect performance.

QUOTE(tech_frix @ Mar 10 2008, 04:50 PM)
Stop repeating the 261% thing...
If you intend to use the onboard graphics, then buy the motherboard with the 261% performance boost.
For me, better to go Intel rather than AMD... this is just my humble opinion.
*
Mind explaining why you'd choose Intel rather than AMD?

QUOTE(DeathChaosX @ Mar 11 2008, 12:09 AM)
In what terms is a graphics card with a higher memory size better than one with a lower memory size?

Edit: Also, if my MSI graphics card was originally installed with the MSI driver, can I install the ATI driver downloaded from the ATI website? No difference, right?
*
The MSI driver that came with the graphics card is the ATI driver anyway; you can just install the latest driver, no problem.
amduser
post Mar 12 2008, 09:22 AM



QUOTE(DeathChaosX @ Mar 11 2008, 06:20 PM)
Can anyone explain to me the difference between a higher and a lower memory size on a graphics card?
*
If you want the simple explanation: a card with more memory can handle higher resolutions and heavier texture settings before running out of memory, although beyond what a game actually uses, extra memory doesn't make the card faster by itself.

The memory's function is the same as the system RAM's, but graphics RAM is used only by the graphics card itself.
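One concrete piece of that memory budget is the framebuffer. A rough sketch (my own illustration, not from the thread) that ignores textures and anti-aliasing, which in practice are what actually eat a card's memory:

```python
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=2):
    """Memory for the colour buffers alone: double-buffered 32-bit colour."""
    return width * height * bytes_per_pixel * buffers / (1024 * 1024)

# The raw colour buffers are small at any common resolution; the gap
# between a 256MB and a 512MB card shows up in textures and settings.
print(round(framebuffer_mb(1280, 1024), 1))  # -> 10.0
print(round(framebuffer_mb(1920, 1200), 1))  # -> 17.6
```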
amduser
post Mar 12 2008, 07:19 PM



QUOTE(xen0 @ Mar 12 2008, 05:23 PM)
For HDDs, is a higher cache better? In what way?

I have two HDDs: a 320GB with 16MB cache and a 160GB with 8MB.

Which one is better for NLE and video compositing work? Video editing accesses the HDD a lot.

Added on March 12, 2008, 5:33 pm

Isn't graphics card memory size more of a resolution issue?
*
If you want a fast HDD you can go for a Raptor. I don't think the cache is that important; if video compositing accesses the HDD a lot, you should choose the drive with the faster speed.

About graphics card memory size, I'm also not clear on that; I normally just compare cards by benchmark scores.
amduser
post Mar 15 2008, 08:15 PM



QUOTE(fense @ Mar 15 2008, 05:52 PM)
May I know: my laptop uses a Kingmax DDR400 PC3200 512MB module.

Can I mix it with another 1GB DDR333 module? What would the likely outcome be?

Thanks.
*
You'd get 1.5GB in total, running at DDR333 speed (the slower module sets the pace).
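The rule of thumb behind that answer (capacities add, but every module runs at the slowest module's speed) can be sketched as follows; the module ratings are the ones from the question:

```python
def mixed_ram(modules):
    """modules: list of (size_mb, speed_rating) pairs.
    Assumes the memory controller clocks all modules at the slowest
    rating, which is the common behaviour when mixing speeds."""
    total_mb = sum(size for size, _ in modules)
    effective_speed = min(speed for _, speed in modules)
    return total_mb, effective_speed

# DDR400 PC3200 512MB mixed with a DDR333 1GB module, as asked:
print(mixed_ram([(512, 400), (1024, 333)]))  # -> (1536, 333)
```

1536MB is the 1.5GB total, and 333 confirms the pair runs at DDR333 speed.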
amduser
post Mar 16 2008, 10:33 PM



QUOTE(fense @ Mar 16 2008, 02:02 PM)
My laptop is a Pentium M; I read on the Intel website that the maximum supported speed is DDR333 only. Is that right?
*
I don't know offhand, but if the website says so, then it is. If you slot in DDR400 I think it will just run at DDR333 speed.
