 Your Home Theater Setup.. v2, Let's share..

SSJBen
post Dec 28 2016, 03:27 PM



QUOTE(Skylinestar @ Dec 28 2016, 12:14 PM)
The main reason is a lower noise floor. An XLR balanced connection is very useful for long wire runs. Unfortunately, unlike pro audio, not all consumer AV gear has a fully balanced connection. Selling an XLR balanced connector carrying unbalanced signalling is a joke.
It's a deep rabbit hole.
*
This is correct. Many pre/pros don't actually have fully balanced connections behind their XLR outputs. Oftentimes, only the L/R channels are balanced, and perhaps the centre in some cases. In that situation, using RCA can actually give a lower noise floor than an XLR cable.

Fully balanced pre/pros for ALL channels are expensive and, unfortunately, a dying breed.
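Here's a rough, editor-added sketch (not from the thread) of why a balanced run wins on noise floor, under the simplifying assumption that interference couples equally onto both conductors; all numbers are purely illustrative.

```python
import numpy as np

# Toy model: a balanced (XLR) run carries the signal on "hot" and its inverse on
# "cold"; interference couples onto both legs equally, so subtracting the legs at
# the receiving end cancels the noise. An unbalanced (RCA) run has no such trick.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 48_000)
signal = np.sin(2 * np.pi * 1_000 * t)         # the wanted audio
noise = 0.2 * rng.standard_normal(t.size)      # interference picked up along the cable

unbalanced = signal + noise                    # RCA: noise rides directly on the signal
hot, cold = signal + noise, -signal + noise    # XLR: same noise on both legs
balanced = (hot - cold) / 2                    # differential receiver cancels it

print("unbalanced residual noise RMS:", np.std(unbalanced - signal))
print("balanced residual noise RMS:  ", np.std(balanced - signal))
```

In this idealised model the balanced residual is essentially zero; in practice the rejection is finite, but long runs still benefit hugely, which is the point being made above.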
SSJBen
post Dec 30 2016, 11:03 PM



QUOTE(saitong09 @ Dec 30 2016, 06:31 PM)
Tested the UDP-203 for MKV playback with audio bitstreaming:
1. DTS-HD MA - OK
2. DTS:X - OK
3. TrueHD - some will bitstream, some fall back to MLP
4. Atmos - MLP only

Seems like Oppo has some compatibility issue with Dolby.
*
Does this also happen with ordinary DD at 640 kbps, or with DD+?
SSJBen
post Dec 31 2016, 10:14 PM



QUOTE(jamesleetech @ Dec 31 2016, 03:10 PM)
You said the Dune Solo 4K does NOT support HDR and Dolby Vision. What confuses me is whether player support is actually needed.

A lot of what I'm saying now is vague and ignorant, so please be patient with me. I could be wrong here.

I may have missed it, but I've Googled everywhere and couldn't find any info on whether HDR and Dolby Vision need the 4K Blu-ray player to support them.

I do know that bitstreaming audio out to the AVR does NOT need the Blu-ray player to support TrueHD/DTS/Atmos/DTS:X/Auro-3D, because the decoding is not done by the player. Only when the player is not set to bitstream does it decode the audio to PCM.

Here is what I believe: no player decodes HDR or Dolby Vision, because it's not necessary. That work is done by the display's decoder, i.e. the TV/projector.

I also believe the HDR and Dolby Vision data on the 4K Blu-ray is "merely" read and bitstreamed out to the TV, which decodes it. What the player can probably do is either block (filter) out the HDR/DV data or let it pass through. If true, I think such an on/off filter is needed because the extra data might cause a non-HDR/DV TV to output incorrect dynamic range, colour, etc.

So, simply put: isn't HDR and Dolby Vision all about decoding support on the display device (TV/projector/phone) and not about players?

Do we not need the 4K Blu-ray/media player to support HDR & DV, since it just "passes" it out, OR is it like you said, and the player (Dune) needs to support it?

Probably player support is needed because the player has to be able to recognise the data; if not, that data gets "ignored" and lost when the video is sent out. Is that why support is needed?

If anyone knows a clear answer to this, PLEASE do chime in and correct me. I need to learn more here. Will certainly appreciate it.
*
As long as the display is capable of receiving HDR10 metadata, it can display it (the keyword is receiving; whether the display can actually do HDR properly is another topic). The receiver/player needs to be set to passthrough (or the equivalent), with no processing applied.

Dolby Vision is a little more complicated. The reason the majority of TVs and players don't support it comes down to licensing fees. Why pay when HDR10 is free? Dolby Vision is better, no question. But when 99% of movie buffs can't even tell the difference between an 8-bit and a 10-bit panel, what's the point of spending the effort for the tiny minority who care?

There is no "filter" needed. If a display cannot receive HDR metadata, it simply won't display it. Plain and simple; there's no issue with wrong colour range.

What I can't answer, however, is why some players can pass HDR through and some cannot. It's not exactly HDCP or HDMI related, as even something as old as the PS4, with HDCP 1.2 and HDMI 1.4, can pass HDR through without issues.
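As an editor-added illustration of the "can the display receive HDR metadata" point: a display advertises HDR10 capability in its EDID, via the CTA-861 HDR Static Metadata data block. A minimal, simplified sketch follows; the file path is hypothetical and real EDID parsing needs more care than this.

```python
def display_advertises_hdr10(edid: bytes) -> bool:
    """Rough check for a CTA-861 HDR Static Metadata block (extended tag 6)
    advertising the SMPTE ST 2084 (PQ) EOTF, i.e. basic HDR10 capability."""
    for start in range(128, len(edid), 128):       # extension blocks after the base EDID block
        block = edid[start:start + 128]
        if len(block) < 4 or block[0] != 0x02:     # 0x02 = CTA-861 extension tag
            continue
        dtd_offset = block[2]                      # data blocks occupy bytes 4 .. dtd_offset-1
        i = 4
        while i + 1 < dtd_offset:
            tag, length = block[i] >> 5, block[i] & 0x1F
            if tag == 7 and length >= 2 and block[i + 1] == 0x06:
                return bool(block[i + 2] & 0x04)   # bit 2 = SMPTE ST 2084 (PQ) supported
            i += 1 + length
    return False

# Hypothetical usage on Linux (the connector path varies per machine):
# with open("/sys/class/drm/card0-HDMI-A-1/edid", "rb") as f:
#     print(display_advertises_hdr10(f.read()))
```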

This post has been edited by SSJBen: Dec 31 2016, 10:14 PM
SSJBen
post Jan 3 2017, 05:08 PM



QUOTE(jamesleetech @ Jan 3 2017, 03:18 PM)
Thanks for your reply. Appreciate it.

You said "the receiver/player needs to be set to passthrough (or the equivalent)". This is what I understood earlier: the metadata is relayed out without being touched by the AVR/player, so it's the TV/projector that processes the HDR10 metadata. Yes, whether HDR can be processed properly is an ongoing debate. I've also heard that the TV/projector must be calibrated (ISF?) properly, and if it isn't, HDR can make things worse, not better.

My question wasn't about the complexity of Dolby Vision or HDR. It's about why a player must be compatible just to pass the HDR signal through to the display. I couldn't find the answer to this, and you couldn't answer it either ("can't answer why some players can passthrough HDR and some can't"), which is odd if player compatibility isn't required.

IF, again IF, player compatibility is required, then the same should apply to the HT AVR/pre-amp connected to the player. Correct me if I'm wrong, but I haven't seen anything saying the AVR/pre-amp needs to be compatible. If compatibility were needed, I'd expect an HDR logo printed on the front of the AVR/pre-amp (similar to the Atmos logo), but I haven't seen one.

Do correct me if I've misunderstood you. As you said, non-HDR TVs simply don't do anything with the extra metadata they receive, so the picture isn't affected. If that's the case, why would Oppo put an HDR on/off setting into the 203 when the HDR metadata doesn't need to be filtered out for a non-HDR TV/projector? Toggling HDR on or off did affect the picture, as someone on Facebook has already tested. What does the Oppo 203 do to the signal when it's set to HDR On or HDR Off? By right, a non-HDR TV/projector just ignores the metadata, so HDR shouldn't affect the picture... or am I wrong here?
*
Displays capable of reproducing proper HDR values (at least 1000 nits peak brightness on a 10% window) follow a different set of calibration values. As for HDR making it worse: yup, just look at all the displays that really only manage 400 nits and pretend that's HDR. Edge-lit TVs are the worst offenders, with haloing and blown-out whites. Unfortunately, the majority of people think it's awesome because... "HDR". Lol.

Perhaps not entirely on topic, but hear me out for a minute. I've done some digging into the HDR compatibility question via the PS4. Yes, I know it's a game console, but it's also a media player, and it's relevant because it had no HDR capability until just over two months ago.
Looking at some of the design prints and schematics, it seems the HDMI controller on the PS4 (a custom Panasonic MN86471A) needs to be able to pass HDR through in the first place. In this case, because the PS4's southbridge encrypts all data with HDCP 2.2, all the HDMI controller needed was a firmware patch for support (which Sony delivered).

Now back to the question at hand: is disc player compatibility required?
From what I understand, many players lack support for various reasons, such as not having the right HDMI controller or, as was already infamous, the HDMI port not actually being a true HDMI 2.0a port with HDCP 2.2 support.

Remember how in 2015 we had receivers with only 2 or 3 ports that were HDMI 2.0a/HDCP 2.2 capable? In 2016, we had countless displays where only ONE HDMI port could accept HDR metadata, 4K at 4:4:4 8-bit, or 4K at 60 Hz.

All this leads me to believe the HDMI controllers (and there are so many of them) are at fault here. Taking the PS4 as an example again (since it's about the only mainstream media player that received a firmware update turning it into an HDR-capable machine), manufacturers can indeed patch HDR support into their receivers/players IF the HDMI controller can handle the data to begin with.

Unfortunately, it's impossible to tell which player/receiver uses which HDMI controller, since there are so many of them. And given that most manufacturers refresh their product cycle every 8-14 months (not everyone is Oppo, telling people it'll be ready when it's ready), it's economically obvious to them that spending time on old products is a waste when they can cash in on the feature with next year's product instead (unless you have something like 45 million units sold on a single SKU).

As for the last question, I don't have an Oppo 203, so I can't comment much on the player giving different results. By right, if the player is connected to a non-HDR-capable TV, the option should simply be greyed out entirely (which is what the PS4 and Xbox One S do).
Can you link me to that FB post where someone tested HDR on a non-HDR-capable display? It sounds to me like the processing is done on the player itself before being sent to the display (basically "bitstreaming" the entire video signal after post-processing).

Good discussion though.


QUOTE(teop @ Jan 3 2017, 04:01 PM)
I'm no expert here, just writing my thoughts for discussion.

I too think that anything in the digital domain, when left unprocessed, should come out identical through the signal path. But I have my doubts after trying to rip audio CDs. I found that what you rip may not be bit-for-bit accurate, depending on the condition of the source CD and the CD reader, and there is no way of knowing whether an error occurred. That tells me this path lacks proper error detection or correction.

When I think about it, that sounds logical: with streaming, most of the data is real-time and therefore time-sensitive, so the system has to tolerate errors. Data files are different; accuracy matters more than timeliness, the system can afford to retry, and so it includes stronger error detection and correction to guarantee accuracy.

So in that sense, it may be possible that Blu-ray players are allowed to read audio with errors to a certain extent. The same goes for audio bitstreaming. If this is true, then it's desirable to keep the path as short as possible...
*
Which ripping software did you use? And what settings? Different rippers use different algorithms, and it has been objectively shown that many ripping programs don't always produce a perfect, bit-matched rip.

It's the same with remuxed Blu-ray movies. I've come across remuxes that use DTS-HD as the codec yet somehow come in at only 1.5 Mbps instead of above 3 Mbps (the average for DTS-HD tracks). Sure enough, I took the same disc version, compared them A-B, and there was an absolute difference, clearly audible to my ears.

That's what you call a confounding variable.
But with all else being equal, I still hear no difference.
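A small editor-added aside: if the rips are plain WAV files, you can check whether two rips of the same track are actually bit-identical by hashing only the decoded PCM, so header or metadata differences don't matter. The file names below are placeholders.

```python
import hashlib
import wave

def pcm_md5(path: str) -> str:
    """MD5 of the raw PCM frames in a WAV file, ignoring any header/metadata differences."""
    with wave.open(path, "rb") as w:
        return hashlib.md5(w.readframes(w.getnframes())).hexdigest()

# Hypothetical rips of the same track from two drives:
# print(pcm_md5("rip_drive_a.wav") == pcm_md5("rip_drive_b.wav"))
```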
SSJBen
post Jan 5 2017, 08:01 PM



Lol, soon everyone's receivers will be outdated again. Lmao.
SSJBen
post Jan 5 2017, 10:49 PM



QUOTE(saitong09 @ Jan 5 2017, 10:43 PM)
If your player has dual HDMI out like the Oppo, the receiver doesn't need an upgrade.
*
Doesn't work for my setup. The Oppo players' HDMI passthrough adds quite a bit of input lag, and I do play games, so that's a big no-no.

That said, HDMI 2.1 can be firmware-patched into HDMI 2.0 hardware provided the implementation followed the standard to begin with. So fingers crossed that my receiver gets it. Then again, companies want to milk their products and make money, so they probably won't even care.
SSJBen
post Jan 6 2017, 04:36 PM



QUOTE(jamesleetech @ Jan 6 2017, 03:11 PM)
I don't follow you (highlighted as blue text in the quote). Hmm... how does playing games cause the Oppo players to add HDMI input lag? I don't have any input lag problems with my 105D, with HDMI 1 to the TV and HDMI 2 to my pre-processor. Do you use a Zone output to stream to another room and play games through the same HT AVR? If I use my PC's HDMI into my HT pre-amp, I don't use my Oppo, and vice versa; both are never switched on together.

Am I right to assume you have a complicated connection setup? Sorry for not being able to figure it out. Please enlighten me.
*
You're mistaken, James.

My reply to saitong09, about why an Oppo (or any player with an HDMI passthrough, for that matter) doesn't work for my setup, was because I play games. This all came up from the HDMI 2.1 post; although nothing uses HDMI 2.1 yet and won't for quite a while, it's something I'm thinking about for the next 2-3 years.

Because saitong09 said "if your player has dual HDMI out like the Oppo, the receiver doesn't need an upgrade", wouldn't that mean passing everything through the Oppo player and then out to the display?
The Oppo's HDMI passthrough is not a pure direct passthrough; it adds latency and causes irregular spikes in input lag (which is a lot worse than a constant number). That's why, for my setup of a gaming PC/HTPC plus consoles, it isn't going to work.

My current setup (and it always has been) is of course the simplest, most direct one: all sources into the receiver, then the receiver out to the display. Receivers do inherently add some input lag as well, but with the GUI/short message overlays turned off and upscaling disabled, it comes down to just under 5 ms of additional input lag, which is mostly unnoticeable compared with plugging directly into the display.
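To put that "just under 5 ms" figure in perspective, here is a trivial, editor-added calculation comparing the added lag to a single frame at common refresh rates.

```python
added_lag_ms = 5.0  # roughly the extra lag quoted above for a receiver with processing disabled

for refresh_hz in (60, 120):
    frame_ms = 1000 / refresh_hz
    print(f"{refresh_hz} Hz: one frame = {frame_ms:.1f} ms, "
          f"added lag = {added_lag_ms / frame_ms:.2f} frames")
```

At 60 Hz that is well under a third of a frame, which is why it is hard to notice against plugging straight into the display.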
SSJBen
post Jan 7 2017, 01:09 AM



QUOTE(Skylinestar @ Jan 6 2017, 08:04 PM)
What saitong09 means is:
Oppo output 1 > TV (with the post-HDMI 2.1 standard)
Oppo output 2 > AVR/pre-pro (without the HDMI 2.1 standard)
...and media playback from the Oppo itself. Nothing to do with the Oppo's inputs or passthrough.

SSJBen, are you saying Oppo output 2 is laggier than output 1? If you're talking about the inputs, then yes, every additional input stage adds lag.
*
If that's what he's saying, then I apologise, because I partly misread what he said.

I was under the impression that he meant all sources except the Oppo player connect to the receiver, then the receiver feeds the Oppo player, then out to the display. Doing that, like I said, causes a noticeable increase in input lag because the signal is being processed by the player. It's not a direct passthrough, even with Darbee entirely disabled.

I'm talking strictly about games here, not movies.



QUOTE(jamesleetech @ Jan 6 2017, 11:39 PM)
Your reply makes it even more confusing for me. Sorry for being dense, so please bear with me here.

I'll use a graphic to better illustrate what I understood from saitong09...

[attachmentid=8369024]

To repeat: what saitong09 said earlier was that the AVR does not need to be updated or upgraded to support HDMI 2.1, since the AVR only processes the audio and doesn't need to be connected to the TV/projector. Since the video from the Oppo is output directly to the TV/projector, all that's needed is for the Oppo to be upgraded/replaced (firmware?) to support HDMI 2.1.

Since you said you use the simplest, most direct connection of all sources to the AVR, is your actual connection the same as the graphic above?

IF it's the same connection I described, then by right your HTPC should be off when you play the Oppo, since only one TV display is used. Lag on either the HTPC or the Oppo shouldn't affect the other, since one of them is off.

Am I missing something here? I don't think you've somehow connected the Oppo's HDMI 1 output directly to your HTPC, which connects to the TV, and the Oppo's HDMI 2 output to your AVR? IF you only use the HTPC's HDMI output to your TV, then yes, it would be a problem when HDMI 2.1 isn't supported.

Yes, I can believe there may be some (unnoticeable) lag when both of the Oppo's HDMI outputs are used. Having the HTPC connected to the AVR shouldn't, by right, increase the Oppo's lag, because the HTPC is off when the Oppo is in use.

What I can easily understand is the Oppo player being "upgraded" to support HDMI 2.1 without the AVR needing to be upgraded or replaced.

If you're saying both of the Oppo's HDMI outputs get disabled when it detects that either one is connected to a device that doesn't support HDMI 2.1, well, then I can understand. For example: Oppo HDMI 1 to device 1 (HDMI 2.1 ready) and Oppo HDMI 2 to device 2 (not HDMI 2.1 ready), and both Oppo outputs go black. I don't know, so is this correct?

I know technology changes very fast, but it's still a bit early to be talking about any HDMI 2.1 update or upgrade. Even once HDMI 2.1 is finalised, it will be a very long wait for 8K Blu-rays to appear. We are just beginning to talk about 4K.
*
I believe I partly misread what saitong said earlier. My bad.

What I'm describing is INPUT lag, not system lag like you're explaining in your post. Input lag matters strictly for games and nothing else (unless you like playing with GUIs...?).

As I explained to bro Skylinestar, I misread saitong's comment. My mistake; I was under the wrong impression that everything goes through the Oppo player.

What intrigues me most about HDMI 2.1 is that it comes with long-awaited, critically important support for Variable Refresh Rate. That's a real game-changer for modern video games: a game no longer needs to be tied to a fixed refresh rate, which causes judder, frame drops or screen tearing. I'm sure you've already heard of it on the PC side via G-Sync and FreeSync.

SSJBen
post Jan 25 2017, 10:34 PM



QUOTE(Jcvendetta @ Jan 25 2017, 09:34 PM)
Hello all,
I'm moving into my new house and am planning to have a dedicated AV room.
I already have my short-throw projector, screen and 5.1 setup (all bookshelves) from my existing AV room. My existing room is just 9 ft x 9 ft, and I'm very happy with the seating distance vs screen size ratio.

My family is small, and at most three people will be watching movies at any one time, so I plan to get only a three-seater sofa.

I would love to hear your thoughts on which AV room is more suitable for this purpose: ROOM 2 or ROOM 3?

If I do it in Room 3, the challenge would be setting up the screen on the wall that measures 9.19 ft. How do I cover up the window and install my screen on that wall?

For Room 2, I would definitely place the projector screen on the wall that measures 13.98 ft, but how should I cover the bathroom and room doors? Would you recommend replacing them with glass sliding doors with a very dark tint?

Thank you. [attachmentid=8439182]
*
Definitely Room 3. It's a fairly symmetrical layout, so it's easy to predict where to put the sub(s) and the room treatments, and to know where the room modes are.
I don't know if you're willing to seal up the window entirely, but if you are, it's a simple job really. Otherwise, just get a pull-down screen (motorised or manual, up to you) instead of a fixed one.

For Room 2, I foresee A LOT of problematic room modes. It's going to be a very time-consuming process: experimenting with sub placements, working out how best to apply PEQ, deciding where to put treatments... lots of things. If you're not fussy about sound, then I guess it doesn't matter much which room you put your HT setup in.

This post has been edited by SSJBen: Jan 25 2017, 10:36 PM
SSJBen
post Jan 26 2017, 02:19 PM



QUOTE(Jcvendetta @ Jan 26 2017, 08:07 AM)
Thanks! To seal the window, I'll probably ask my contractor to install plywood over it. I think the bathroom door has to be changed to a much thicker one too... because the current door is just too flimsy and will give lots of rattling noise.
*
A solid-core door is a good start. Best if you can do two doors.
SSJBen
post Feb 21 2017, 06:22 PM



QUOTE(Kent3888 @ Feb 21 2017, 02:28 AM)
Which is the better option overall? The area to cover is 580 sq ft, which I'd say is around 5,800 cu ft. It's mostly for movies, the fronts are already Klipsch, and I'm not planning to play loud enough to wake the neighbours or tear down the wall. I'm taking sealed subs into consideration, hence the SB16 came into the picture.

2x Klipsch R-115SW or 1x SVS SB16-Ultra

Klipsch R-115SW
The Reference R-115SW subwoofer is the perfect combination of power, sophistication and heart-pounding bass. Packing a serious punch, the R-115SW will not only wake your neighbors, it'll piss them off. The way only Klipsch can.

Features:
15" spun-copper Cerametallic woofer
All-digital amplifier delivers 800 watts of dynamic power
Front-firing slot port with exclusive internal flare technology.
L/R line-level/LFE RCA inputs for compatibility with most receivers
For a simple wireless connection, add an optional Klipsch WA-2 Wireless Subwoofer Kit
Dimensions: 21.5" H x 19.5" W x 22.3" D
Brushed black polymer veneer cabinet with satin painted plinth
SVS PB-16ULTRA

An unrelenting passion for awesome bass performance and engineering perfection guided every aspect of the SB16-Ultra subwoofer’s design. Groundbreaking technology, rigorous design and extreme testing in real world and laboratory environments allowed SVS to achieve massive output levels, extreme low frequency extension, near-perfect frequency response accuracy, and pinpoint transient response. The culmination of all SVS design advancements, SB16-Ultra represents the greatest leap forward in performance and innovation since the inception of subwoofers.

QUICK SPECS
Driver: 16"
Amplifier: 1,500 watts RMS (5,000+ watts peak)
Frequency response: 16-460 Hz ±3 dB
Dimensions: 20" (H) x 19.5" (W) x 20.1" (D)
Weight: 122 lbs
*
Err... how is 580 sq ft equal to 5,800 cu ft? You can't directly equate them, since cubic feet add an extra dimension (height). So which is it?

To answer your question, the PB16-Ultra is undeniably the better sub. It has a lot more output below 25 Hz than the R-115. Even if you pair 2x R-115 in the same spot for the maximum gain (~+6 dB), the PB16-Ultra will still beat them below 16 Hz with ease. Of course, place the pair of R-115s well and they'll sound smoother than a single sub. You have to decide which matters more to you: a smoother response across more seats, or more output for one (or two) seats.
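A quick editor-added aside on that ~+6 dB figure: two identical subs co-located and playing the same signal sum coherently, doubling the pressure, which works out to 20·log10(2) ≈ 6 dB.

```python
import math

def coherent_gain_db(n_subs: int) -> float:
    """Maximum SPL gain from n identical, co-located subs summing coherently vs one sub."""
    return 20 * math.log10(n_subs)

print(f"2 subs: +{coherent_gain_db(2):.1f} dB")   # ~ +6.0 dB
print(f"4 subs: +{coherent_gain_db(4):.1f} dB")   # ~ +12.0 dB
```

Subs placed apart in the room don't sum perfectly coherently, so the real-world gain per doubling is typically closer to +3 dB.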

Now, since your room dimensions aren't clear and I presume 580 sq ft is just your floor area, both subs should (in theory) cover the space quite well with some headroom to spare. It really comes down to how big your bank account is.

Personally, I'd buy one PB16-Ultra now, then another PB16-Ultra later when you have the funds. Buy once, enjoy forever.

SSJBen
post Feb 22 2017, 07:51 PM



QUOTE(Kent3888 @ Feb 22 2017, 12:35 AM)
I'm running a single R-115 now and considering whether to get another R-115 or sell it off to get an SB16-Ultra. Funds-wise, 2x R-115 or 2x PB-2000 comes to somewhere close to a single SB16-Ultra. Sorry for the typo, it's supposed to be SB, not PB, as the PB seems too big and bulky for the living room, and there's also a price leap up from the SB16-Ultra.
That's how I got 5,800 cu ft: 580 sq ft x 10 ft ceiling.
*
Alright, in that case... even dual PB16-Ultras aren't going to be enough. And since you said SB16-Ultra, that's even less output.

Dual PB-2000s will still get you more output below 25 Hz than a single SB16-Ultra, though. How loud do you usually play your system, by the way?
SSJBen
post Mar 13 2017, 04:40 PM



QUOTE(Kent3888 @ Mar 13 2017, 09:05 AM)
New beast. Mind blowing power
*
So you're the one who bought from Chong recently (correct me if I'm wrong).


SSJBen
post Mar 13 2017, 05:10 PM



QUOTE(Kent3888 @ Mar 13 2017, 04:43 PM)
Haha yeah... what a small world. He told you?
*
Saw it on his FB. The happy "couple", yes?

Good that you're enjoying the SB16. A fantastic sub indeed.
SSJBen
post Mar 25 2017, 08:14 PM



QUOTE(JML @ Mar 25 2017, 12:27 PM)
Hi guys, going to join the club soon.
I got a proposal with the items below and need some advice... thanks.

Yamaha RX-V681 - RM2,900
Dali Zensor 5 floorstanding speakers (1 pair) - RM3,900
Dali Zensor Vokal centre speaker - RM1,800
Dali Zensor Pico surround speakers - RM1,600
Dali Phantom E-50 in-ceiling Atmos speakers - RM1,900
GoldenEar ForceField 3 subwoofer - RM2,500
Supra HD5 HDMI cable (12 m) - RM1,650
*
Seven speakers (of only average efficiency) running off a V681? I don't agree. Unless you have a small room or don't play loud, that receiver is going to be stressed a lot.

If you're only going to use one receiver for seven speakers, you're better off getting high-efficiency speakers. Look at the Klipsch RP series instead.
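A rough, editor-added way to see why efficiency matters here, using the standard back-of-envelope relation SPL ≈ sensitivity + 10·log10(watts) - 20·log10(distance), ignoring room gain; all the numbers are illustrative.

```python
import math

def watts_needed(target_spl_db: float, sensitivity_db_1w_1m: float, distance_m: float) -> float:
    """Approximate per-speaker amp power to hit a target SPL at the listening distance,
    ignoring room gain: SPL = sensitivity + 10*log10(P) - 20*log10(d)."""
    return 10 ** ((target_spl_db - sensitivity_db_1w_1m + 20 * math.log10(distance_m)) / 10)

# Illustrative numbers: 105 dB peaks at a 3 m seat.
for sens in (86, 89, 96):   # average bookshelf vs slightly higher vs Klipsch-like sensitivity
    print(f"{sens} dB/W/m -> ~{watts_needed(105, sens, 3):.0f} W per channel")
```

Every extra 3 dB of sensitivity halves the power the receiver has to deliver, which is the whole argument for high-efficiency speakers on a mid-range AVR.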
SSJBen
post Mar 27 2017, 04:57 PM



QUOTE(sivanathan04 @ Mar 27 2017, 12:48 AM)
Oh, is there any other solution to solve this problem without installing a sub?
*
Set your speakers to "Small" and set a crossover of at least 80 Hz. But understand that if you do this without a sub, you're effectively killing off everything below roughly 75 Hz from your main L/R/C speakers' bass. Things will therefore absolutely sound thin.

The other way: get a two-channel (or multi-channel, if you want) power amp and use it to power your main front speakers. You can then run them as "Large", i.e. full range. However, the receiver or pre/pro needs pre-outs to do this, and the V681 unfortunately doesn't have them.


QUOTE(Vannus @ Mar 27 2017, 07:41 AM)
Can someone enlighten me? During 5.1 surround calibration using the provided mic, how should the active subwoofer sound? I felt it sounded too soft and low compared to the rest of the speakers. I set the crossover frequency to 80 Hz and the volume to 50% on the back of the subwoofer, and selected bypass on the receiver. Hmm, I'm wondering if this is correct.
Does anyone know the best crossover frequency to set on the receiver for the front L, front R, centre, surround L and surround R?
*
Simply setting the sub's volume to 50% (or 12 o'clock) doesn't guarantee the sub will sound loud. Why? Room modes, that's why.

To put it simply:
1) Your seating position may be in a HUGE null, which is why you don't hear much from the sub.
2) Your sub may be placed badly relative to your main listening position, again landing it in a null.
3) 50% is not a standard volume for all subs. Sub A at 50% might hit the 75 dB reference level at the main listening position, while sub B might need to be at 75% to hit 75 dB SPL.

There are plenty of other factors too. This is why it's highly recommended to get an SPL meter and VERIFY the output level from your listening position. (A rough sketch of how room dimensions set those mode frequencies follows below.)
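On the room-mode point, here's the editor-added sketch promised above, using the standard axial room-mode formula f_n = n·c / (2·L) for each room dimension. The room dimensions are placeholders; plug in your own.

```python
SPEED_OF_SOUND = 343.0  # m/s at roughly 20 degrees C

def axial_modes(length_m: float, count: int = 4) -> list[float]:
    """First few axial room-mode frequencies for one dimension: f_n = n*c / (2*L)."""
    return [n * SPEED_OF_SOUND / (2 * length_m) for n in range(1, count + 1)]

# Hypothetical room: 5.0 m long, 3.5 m wide, 2.7 m ceiling.
for name, dim in (("length", 5.0), ("width", 3.5), ("height", 2.7)):
    modes = ", ".join(f"{f:.0f} Hz" for f in axial_modes(dim))
    print(f"{name} {dim} m: {modes}")
```

Peaks and nulls at those frequencies fall at predictable fractions of each dimension, which is why both the seat and the sub can end up sitting in a null.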

As for the crossover setting, 80 Hz is a good general starting point, but it's not a rule that every HT setup must follow. It depends on the room, and it certainly depends on the speakers. If your speakers can't play down to 80 Hz with authority, setting the crossover to 80 Hz will leave an obvious gap above the speakers' -6 dB roll-off point. There is no "best" crossover setting, unless everyone owns the same speakers and sub and lives in the same room.

What receiver are you using, anyway?
SSJBen
post Mar 27 2017, 05:28 PM



QUOTE(Vannus @ Mar 27 2017, 05:11 PM)
Thanks for the advice; yes, I'm a newbie to HT sound. I'm using a Denon X1300W, with Yamaha NS-555 and NS-444 as the front, rear and centre speakers. For the surround back L and R I'm using Boston Acoustics XS speakers. For the subwoofer, I'm using the ancient Wharfedale Diamond SW150.

During the speaker setup and test tones I can hear the subwoofer loud and clear, but after I run the Audyssey calibration the subwoofer sounds puny.

It gets complicated once it comes to sound wavelengths, positioning and measurement tools. I don't have any SPL tool. Anywhere I can get one cheap?
*
Okay, so Audyssey. My suggestion is to do a sub crawl first.

Place your sub at your main listening position, then go squat in the spots where the sub could be placed and play some bass-heavy music. Listen and feel for the best spot, then adjust the volume accordingly.

The idea is that you want Audyssey to report your sub's volume trim at a negative level; anywhere between -6 dB and -3 dB is just nice. Adjust the sub's own volume knob until the calibration gives you that readout. After that, you can "run your sub hot" by boosting that trim by +3 dB or so. It's important not to go over 0 dB, because MOST receivers clip the sub output beyond 0 dB, which causes distortion.
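An editor-added sketch of the trim arithmetic described above; the +3 dB "hot" figure and the 0 dB clipping limit come from the post, the helper itself is just illustrative.

```python
def run_sub_hot(audyssey_trim_db: float, boost_db: float = 3.0) -> float:
    """Boost the Audyssey sub trim to run the sub 'hot', refusing to go past 0 dB
    since many receivers clip the sub output above that point."""
    new_trim = audyssey_trim_db + boost_db
    if new_trim > 0.0:
        raise ValueError(
            f"{new_trim:+.1f} dB exceeds 0 dB; turn up the sub's own volume knob and "
            "re-run calibration instead of boosting the trim further."
        )
    return new_trim

print(run_sub_hot(-6.0))   # -3.0
print(run_sub_hot(-3.0))   #  0.0
```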


You can get an SPL meter from Lelong or RadioShack (hurry, before they close down for good). An analogue one is even better, but if you can't find one, a digital meter is okay. They should cost no more than RM70.
SSJBen
post Mar 29 2017, 03:34 PM



QUOTE(sivanathan04 @ Mar 29 2017, 07:19 AM)
Yup very true bro once I change to small and set the crossover to 80hz the bass is off...now my L/R bookshelf set to large and center set to small..now the boom from center speaker is off..now the sounds abit good
I prefer ypao natural..ypao front feel like more bass..i run ypao again after the correction now sounds a bit good...yup bro got graph there I didn't tweak it caused no idea will take photo later
*
I would still suggest to set all your speakers to "Small". Bass management is important for movies especially. But yeah, what sounds good to you is the most important at the end of the day.

YPAO front has more bass because all it does is EQ all your speakers to your front L/R, which often times has the biggest drivers in the speaker system after the sub.
SSJBen
post Apr 5 2017, 03:04 PM



You can change all the power cables in your house, but can you change all the power cables coming to your house from the power station(s)?

Come on la, guys.
SSJBen
post Apr 5 2017, 06:18 PM



QUOTE(Kent3888 @ Apr 5 2017, 05:54 PM)
Using shielded audiophile cables is meant to prevent interference after the power has been filtered by whatever filtering block and splitter, assuming the noise contamination on the TNB supply gets filtered out by the block.
*
Ah, the good ol' "skin effect" explanation. So you're basically saying the AC line that comes into the house, after being converted to DC, is not a "flat" line going into the components, and audiophile cables will "filter" that DC voltage into something more linear, correct?

Explain to me, then, how a cable with no active components changes a DC line that isn't flat into one that is. The power cable has no transformer at all between the A plug and the B plug, so are you telling me a passive IEC cable can somehow change the incoming voltage?

I know this subject is touchy and I hope no offence is taken, but I've never been given a straight answer on how an entirely passive cable can do anything to the incoming voltage. I want to know, so I can justify spending RM1k on a 2 ft power cord thicker than my penis.

18 Pages « < 6 7 8 9 10 > » Top
 

Change to:
| Lo-Fi Version
0.1199sec    0.28    7 queries    GZIP Disabled
Time is now: 5th December 2025 - 02:12 AM