
 MSI Gaming Notebook V.II, The Refreshed model yet more powerful!

Areas Elysian
post Jan 6 2014, 05:42 PM

Regular
******
Senior Member
1,132 posts

Joined: Oct 2010
I keep saying I should do it, but I never have..

But the Full HD version of the GE40 is seriously leaps and bounds better. The viewing angles are amazing. Not that anyone would ever view their screen from those odd angles. LOL.
Areas Elysian
post Jan 10 2014, 04:08 PM

QUOTE(kyu85 @ Jan 9 2014, 08:06 PM)
any GS70 users here? Might be getting one, either a GS70 Raid or GE60 Raid... any feedback?
*
GF has a GS70.

What kind of information are you looking for?

The GS70, as it is, is a really good machine for the weight: 2.6 kg, yet 17". That's only 0.2 kg heavier while providing a larger screen.

Only issue is, being designed around "slim", it has a larger-than-normal footprint, so finding bags for her was hard. Even my bag, which is rated for 17" (it fits my dad's 17" HP easily), can't fit the GS70. MSI of course provides their default bag, which fits it perfectly.
Areas Elysian
post Mar 21 2014, 04:50 PM

QUOTE(storm88 @ Mar 21 2014, 04:49 PM)
you got my meaning wrong already

many people keep claiming the GTX 860M has the same performance as the GTX 770M, and I'm just saying the one with similar performance is a GTX 770M with limited bandwidth.
*
Is the mSATA easily user-upgradable? Or is it hidden, etc.?
Areas Elysian
post Mar 21 2014, 05:01 PM

QUOTE(storm88 @ Mar 21 2014, 04:54 PM)
u mean wnich model?
*
GS60

It's ok. I've figured it out.

The mSATA module is on the upper side of the PCB, so you'll need to remove the entire board to change it.

Secondly, it does not use the standard mSATA connector. Instead, it uses the M6e's connector, the same one used on Macs.
Areas Elysian
post Mar 21 2014, 05:17 PM

QUOTE(ThisIsBoletaria @ Mar 21 2014, 05:08 PM)
That's somewhat annoying.  Any idea if it's possible to get those through retail channels?
*
C-Zone is selling the Plextor M6e.

Downside is, it's much more expensive than the Plextor M5M mSATA:

66% more for the 128 GB variant
48% more for the 256 GB variant

On the plus side, the read and write speeds are also much faster, but I'm not sure how accurate the information on their price list is.

Source
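Those premiums are just the price difference divided by the M5M's price. The prices in the snippet below are made-up placeholders (the actual price list isn't reproduced here), chosen only so the arithmetic lands on the quoted percentages:

```python
# Hypothetical prices, NOT the real C-Zone list prices.
def premium_pct(m6e_price, m5m_price):
    """Percentage premium of the M6e over the M5M at the same capacity."""
    return round((m6e_price - m5m_price) / m5m_price * 100)

print(premium_pct(415, 250))  # 128 GB variant -> 66 (% more)
print(premium_pct(740, 500))  # 256 GB variant -> 48 (% more)
```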


Edit:
I did some further reading.

The connector format is known as the "M.2".


Further Edit:
Looks like the speed boost is correct and confirmed. It's due to the increase in bandwidth from moving to the M.2 interface from SATA 3.

This post has been edited by Areas Elysian: Mar 21 2014, 05:34 PM
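To put rough numbers on that bandwidth jump: the M6e is, as far as I know, a PCIe 2.0 x2 card, and both SATA 3 and PCIe 2.0 use 8b/10b encoding (10 bits on the wire per data byte). A back-of-envelope sketch of the interface ceilings, not benchmark data:

```python
def effective_mb_s(line_rate_gbps, lanes=1):
    """Theoretical payload throughput in MB/s after 8b/10b encoding."""
    return line_rate_gbps * 1e9 / 10 / 1e6 * lanes

print(effective_mb_s(6))     # SATA 3: 6 Gb/s single link  -> 600.0 MB/s
print(effective_mb_s(5, 2))  # PCIe 2.0 x2: 5 GT/s per lane -> 1000.0 MB/s
```

Real drives land well below these ceilings, but the roughly 600 vs. 1000 MB/s gap matches the kind of speed boost the M6e advertises over SATA drives.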
Areas Elysian
post Mar 21 2014, 05:44 PM

QUOTE(ThisIsBoletaria @ Mar 21 2014, 05:42 PM)
Oh. so that's an M2 connector.  Well, that's better than it being a proprietary format, since the M2 format is supposed to be the new standard going forward.
*
It is the new standard. So we'll just have to fork out extra for being first in line to use it.
Areas Elysian
post Mar 21 2014, 05:47 PM

QUOTE(shin2l @ Mar 21 2014, 05:45 PM)
Nevertheless, no matter which chip the 860M is based on, both should have performance similar to the 770M. But I wonder why Nvidia wants to release two different 860Ms...
*
Part 1:
Same performance with regards to horsepower, but not the same with regards to power consumption.
With only a 55 Wh battery, every little bit of efficiency helps...

Part 2:
A good way to get rid of excess/leftover 770M chips!
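On the battery point in Part 1: runtime is simply capacity divided by average system draw, which is why small efficiency gains matter on a 55 Wh pack. The wattage figures below are illustrative guesses, not measurements:

```python
def runtime_hours(battery_wh, avg_draw_w):
    """Rough runtime estimate: capacity divided by average draw."""
    return battery_wh / avg_draw_w

# 55 Wh pack with guessed average draws:
print(round(runtime_hours(55, 70), 1))  # gaming at ~70 W    -> ~0.8 h
print(round(runtime_hours(55, 15), 1))  # light use at ~15 W -> ~3.7 h
print(round(runtime_hours(55, 12), 1))  # shave off 3 W      -> ~4.6 h
```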
Areas Elysian
post Mar 21 2014, 06:21 PM

QUOTE(storm88 @ Mar 21 2014, 06:19 PM)
but based on the info I dug up from Newegg, it is mSATA, no?
*
Yeah, it has been advertised as mSATA; I have no idea why.

Check the video review that was posted earlier. He did a teardown of it, and you can clearly see the SSD is not mSATA but M.2.
Areas Elysian
post Mar 21 2014, 06:37 PM

A link to the part where you can see the M.2 SSD.

https://www.youtube.com/watch?v=raFVNWVkDbE...etailpage#t=739

It's the looonnngg mini card.

Edit:

For comparison: top left = mSATA, right = M.2, bottom = SATA.

You can see how the M.2 SSD matches the look of the SSD found in the GS60, as shown in the video linked above.

[image: mSATA vs. M.2 vs. SATA size comparison]

This post has been edited by Areas Elysian: Mar 21 2014, 06:57 PM
Areas Elysian
post Mar 27 2014, 10:17 AM

QUOTE(stringfellow @ Mar 27 2014, 04:24 AM)
Ran Crysis 3, Battlefield 4 and Titanfall on maxed-out settings, AA toned down to FXAA. Crysis 3 ran at 45 fps, BF4 at 55 fps, and Titanfall at a rock-solid 60 fps without wavering!

Temperature climbed up to 71°C before being arrested by the jet-engine-like cooling fans that came on automatically.

Did not play long. I'm stuck in shitty Jeddah and the hotel I'm stuck in has a spotty connection. Can't even run the games long without them disconnecting from Steam and Origin.
*
Offtopic:

And that is why I refuse to buy those stupid games that require online DRM. =\
Areas Elysian
post Mar 27 2014, 01:50 PM

QUOTE(koopa @ Mar 27 2014, 01:15 PM)
Hmm, now I'm worried my GS70 will overheat even faster.
*
Just get a nice cooler and use it.
Areas Elysian
post Mar 28 2014, 04:27 PM

QUOTE(ReverseDark @ Mar 28 2014, 04:00 PM)
As you requested: IMO it's quite thin in my book, better than all my previously owned laptops, e.g. Asus, Dell, Acer, Toshiba.
[image]
I took the shot in Nando's; the table is not mine. Anyway, hoping for good news from MSI. If they're slow in entertaining you, give me a message and I'll go make a fuss at their HQ, as well as about the headset and backpacks.
*
You'd better mention that it is not the above picture, but the earlier one with the wooden table.

Don't want to get Nando's in trouble.
Areas Elysian
post Apr 2 2014, 09:59 AM

QUOTE(MSI-NB @ Apr 1 2014, 03:53 PM)
[image]

Date: 4th-6th April 2014
Time: 11am-9pm
Venue: KLCC Convention Centre
Booth no: 104

Come visit us this Friday at the PIKOM PC Fair! Join in the fun and check out the latest MSI gaming laptops!
*
When will the GS60 be released? Is it out yet?
Areas Elysian
post Apr 3 2014, 06:22 PM

QUOTE(stringfellow @ Apr 3 2014, 02:41 PM)
With the 880M? Slideshow.

With tri-SLI Titans? Depending on the game and level of maxed-out settings, 45-110 fps.

With Crossfire R9 290Xs? Depending on the game and level of maxed-out settings, 35-75 fps.

At 4K resolution, 3840x2160, you don't need AA. Not even FXAA. At that res, pixels are so small they blend into each other, giving you free anti-aliasing.
*
I would still rather have some kind of AA, even if it's only 2x. It does make a difference. No matter how high the PPI, things will always look better with AA.
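For reference, pixel density is just the diagonal pixel count divided by the diagonal size in inches; the screen sizes below are the ones that come up in this discussion:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 28)))  # 28" 4K  -> 157 PPI
print(round(ppi(3840, 2160, 24)))  # 24" 4K  -> 184 PPI
print(round(ppi(2560, 1440, 27)))  # 27" QHD -> 109 PPI
```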
Areas Elysian
post Apr 3 2014, 07:13 PM

QUOTE(stringfellow @ Apr 3 2014, 06:53 PM)
To each his own. I don't need AA; I channel that processing power into getting more framerate rather than worrying about jaggies the size of a minuscule 4K pixel. Pixels on a 55"/65"/85" UltraHD TV are much more apparent than pixels on a 32" or 24" 4K monitor. I speak from experience, what's yours?
*
Depends on what you're playing, of course, and you're right, to each their own. Some people swear by their Denons, while others are happy with their Beats IEMs... (yuck)..

If I'm playing an FPS or MMO where precision and timing matter, I'll sacrifice visuals for performance.

But for games like Skyrim, where the visuals are part of the immersion and gameplay, the jaggies get to me.

I speak from experience too.

Running SLI GTX 780s on a Dell 28" @ 3840x2160.

This post has been edited by Areas Elysian: Apr 3 2014, 07:15 PM
Areas Elysian
post Apr 3 2014, 07:43 PM

QUOTE(stringfellow @ Apr 3 2014, 07:29 PM)
Again to each his own. You're probably happy with running 4K at 30Hz on that 28" TN panel Dell of yours, but requirements at 4K30 isn't as stringent as bumping it up to 4K60. Turning AA on at 4K60 would halve what little framerate I'd get at 4K60. I'm more picky of image quality because I went for the IPS (Dell UP2414Q) and IGZO (Asus PQ321Q) panels than TN panels. The only reason folks go for TN panels would be for the higher refresh rates it is capable of, but not in the case of 4K since you're stuck with 30Hz on that Dell of yours.

Let's see: 4K at a lower framerate but cleaner edges, versus 4K at a higher framerate with supposedly jaggier edges, which to me personally have no jagged edges thanks to smaller pixels/higher pixel density. I'll take the latter, thanks, no questions.

Pardon the sidetracking guys, this is after all an MSI notebook thread.
*
Just to steer it back a little to MSI: I connect my MSI to these monitors.

I am also picky about image quality. I'm running a three-monitor setup, two of them 27" U2713HMs. I actually had a third 27" U2711 which died recently, out of warranty, so I picked up the 28" TN as a temporary replacement, since Dell does not have a 27/28" 4K IPS yet and it's cheap. I still game mainly on the 27"s; the 28" was only used for testing, hence I know about the jaggies and all. Will replace the TN with IPS when that model comes out.

Edit:
P.S. While it may be sidetracking, it is still relevant to an extent, for people deciding whether they feel the need to go 4K or not.

Edit 2:
Remember the main reason for this discussion: at 4K, jaggies are still noticeable to some, especially for things such as hanging telephone/electrical wires and cables; those look horrible without AA. Again, depending on the game and scene. My pet peeve, I suppose. Sucks for me.

This post has been edited by Areas Elysian: Apr 3 2014, 07:54 PM
Areas Elysian
post Apr 3 2014, 09:11 PM

QUOTE(stringfellow @ Apr 3 2014, 07:56 PM)
And if these monitors are your daily drivers, and not the 4K monitor, it goes back to long-term experience with the 4K monitor to gauge whether anti-aliasing is important or even necessary on a 3840x2160 screen at 24". Folks who are used to gaming at 1920x1080 or 2560x1440 find it necessary to turn on AA because pixel density is lower and pixel size is bigger on those, making stair-stepping jaggies more apparent. Not so much with the pixels on a 4K monitor at 24", or 32" for that matter.

At 1080p or 1440p, running at presently much more affordable GTX 780 variants of the card, you have ample enough horse power to turn on AA without dropping framerates below whatever standards you're okay with (some are okay with 30, other set theirs at 40fps or more). Not at 4K.

P/S: I'm sorry guys, I tried steering back the conversation to our laptops.
*
I suppose.

I tried the 4K for a few hours and somehow didn't like it. I had to turn off AA due to the lag (yes, Skyrim with full mods can bring dual GTX 780s to tears at 4K with AA), and I wasn't happy with the jaggies; I kept noticing them. Not as a primary driver. Gotta wait for Dell to come up with their 27/28" 4K IPS monitor.

Areas Elysian
post Apr 4 2014, 05:47 PM

QUOTE(Azureknight94 @ Apr 4 2014, 05:41 PM)
Hi, I am using a GE40 and everything seems to be great except that my Wi-Fi seems to drop randomly. It's the Realtek Wi-Fi card, and I tried Googling around; it seems there are a few others with the same problem. Couldn't find any solutions to the problem though. Any ideas? I tried looking for the drivers too but couldn't find one.
*
According to Device Manager, mine is a "Realtek RTL8723AE Wireless LAN 802.11n PCI-E NIC".

Is that the same as yours?
Areas Elysian
post Apr 24 2014, 08:53 AM

To all GS60 users here: how are you finding the trackpad with regards to sensitivity for small, precise movements? (It's a clickpad.)

To all GS70 users (I'm assuming it's the same clickpad): how are you finding it in the same regard?

Secondly, anyone with any solutions to the issue?

I'm finding the GS60's clickpad unusable as it currently is. I get too many accidental edge swipes and movements from the one-piece design, not to mention the sensitivity for small, slow, precise movements is terrible, e.g. when trying to slowly click on a small checkbox.
Areas Elysian
post Apr 25 2014, 09:51 AM

QUOTE(kyu85 @ Apr 24 2014, 06:17 PM)
Have been using a mouse since the first day I bought it... I think the MSI mousepad is quite well known to have this problem...
*
I'm comparing with the GE40, which of course has dedicated click buttons; that helps a lot. But it looks like the sensitivity took a hit when they made the pad larger. I had no issues with the sensitivity on the GE40.
