Interesting Q and more interesting A!


  • #1

    Game PC first look:

    Q: While we know the card isn't in the final phases yet, is there any target Matrox is shooting for, performance-wise, for the Parhelia? Above the current leader, the Ti4600?
    A: Good question and I'm glad you brought it up. There have been rumors and speculation floating around the web for some time now, and we are concerned that users may be expecting a frame rate killer here and that we're going to topple the likes of nVidia and ATI. The technology we have is compelling and industry-leading, there is no question about that; however, performance won't be on the scale of 250 fps in Quake.

    The thrust behind Parhelia-512 was to offer a very high quality graphics card with tons of features that will provide more than adequate performance when the quality is turned up. Where the competition will drop off considerably, the Parhelia-512 will keep chugging along happily due to our deep shader and texture pipeline. But we are still a four-pixel architecture, and as such, when running simple dual-textured benchmarks and games without anisotropic filtering enabled, our card won't be working very hard (you basically get aniso for free with Parhelia). We really want users to understand that with Parhelia-512 the quality and feature set are the real selling points, with the performance delta changing little when quality is turned all the way up. There is just so much more to this market than simply making your games play faster: you've got Longhorn on the horizon with 10-bit computing, multiple 3D desktops and all those cool DX 8 titles on the way, not to mention DX 9 not far off.
    Cheers, Reckless

  • #2
    Yes ... I find it amusing that people are still concerned about frame rates on a previous-gen game like Quake 3 with a next-gen chip like the Parhelia. I think gamers should be looking at the features, quality, and performance of the latest games as well as upcoming games like Doom III. Quake 3 and such should perform quite well with all the bells and whistles, as well as being able to exploit the surround gaming feature.
    The world just changed, Sep. 11, 2001

    • #3
      Here's what I think is the basic problem. You can't easily put quality into an Excel graph. People, especially most internet users, want their information in quick, easily digested bites.

      That only works if the reviewers take the time to research the product and compare apples to apples. And even if they include antialiasing and anisotropic filtering in their benchmarks, it doesn't sound like the Parhelia will fare very well.

      See... this is what it's all about. You can spin numbers to create nearly any sort of result you'd like... You can run tests to make the GeForce 4 look killer, or to make the Parhelia seem like the greatest thing since sliced bread.

      • #4
        Of course.
        This is exactly what nVidia has been doing for years by faking trilinear filtering, when anyone who's studied it knows they only use a buffed-up version of bilinear and call it trilinear. To the uninformed, it reads as higher FPS. To the knowledgeable, it reads as inferior quality.
        Or bragging about their AGP 4x speeds when they only use DMA AGP transfers, while Matrox's DIME AGP implementation still nearly matches or beats others even in 2x mode.

        As to your speculations about comparing FAA performance, I think you will be pleasantly surprised when you see benchmarks made public. The performance penalty on a 512-bit GPU will be much less than it is on the 256-bit cards out today.
        Core2 Duo E7500 2.93, Asus P5Q Pro Turbo, 4gig 1066 DDR2, 1gig Asus ENGTS250, SB X-Fi Gamer ,WD Caviar Black 1tb, Plextor PX-880SA, Dual Samsung 2494s

        • #5
          Ah. The joy of spin doctoring. Every public relations department's job is to make their product as appealing as humanly possible, even if it means using creative naming conventions that aren't exactly accurate. They gamble that the average consumers and journalists won't dig deep enough to discover the truth.

          Personally, I don't think that NVIDIA makes a bad video card. I just don't think that they pioneer quite as much as they would like people to believe. They push older technology to its limits, because it's more cost effective from an R&D standpoint. Matrox appears to have a very different design philosophy where they push the envelope of technology, putting more money into untested areas. It's a bigger gamble, which often makes public companies and their shareholders nervous and unwilling to take it.

          • #6
            Oh, so what you're saying, Chrono, is that instead of actually developing groundbreaking technologies for themselves, nVidia just steals it from companies like 3DFx, Matrox and SGI, while spin doctoring like Kruzin said, then pimping it to the ignorant mini's of the world as gospel? Yeah, that's it!
            "Be who you are and say what you feel, because those who mind don't matter, and those who matter don't mind." -- Dr. Seuss

            "Always do good. It will gratify some and astonish the rest." ~Mark Twain

            • #7
              nVidia has its own technologies, but what they do is push it to the limit. Look how long the GF line of cards has lasted. Sure, they put in a few new features along the way, but the core of the chip is the same. Look how far it went. I was really disappointed in the GF4 because it really didn't offer anything new to the graphics market. Sure it was faster, but that was about it. Parhelia is the first thing I have seen that has actually made me cheer. It uses new technologies and new approaches to 2D and 3D, and it doesn't just try to be "faster." I still want to wait and see what ATI and nVidia have to offer with their R300 and NV30, but if they don't have anything groundbreaking like Matrox has (in other words, if they are pretty much just faster cards [even if faster than Parhelia] with DX9 features), then I will go with Matrox for sure.
              System Specs:
              Gigabyte 8INXP - Pentium 4 2.8@3.4 - 1GB Corsair 3200 XMS - Enermax 550W PSU - 2 80GB WDs 8MB cache in RAID 0 array - 36GB Seagate 15.3K SCSI boot drive - ATI AIW 9700 - M-Audio Revolution - 16x Pioneer DVD slot load - Lite-On 48x24x48x CD-RW - Logitech MX700 - Koolance PC2-601BW case - Cambridge MegaWorks 550s - Mitsubishi 2070SB 22" CRT

              Our Father, who 0wnz heaven, j00 r0ck!
              May all 0ur base someday be belong to you!
              Give us this day our warez, mp3z, and pr0n through a phat pipe.
              And cut us some slack when we act like n00b lamerz,
              just as we teach n00bz when they act lame on us.
              For j00 0wn r00t on all our b0x3s 4ever and ever, 4m3n.

              • #8
                I think that was a really good review in terms of non-bias, as well as being easy to understand.

                I think performance on older games is a non-issue; Quake 3 has been fast enough for the past 2 years, so who cares if it is any faster today? The main factor behind Parhelia is the supreme QUALITY of the graphics at good frame rates. This is where Parhelia will blow away everything else. If reviewers just concentrate on Q3 at 1024x768 they will never see the true potential; they need to enable anisotropic filtering, 16x FAA and 10-bit colour on all the newest games to see just how far Parhelia is ahead of everything else.
                Dell Inspiron 8200
                Pentium4m 1.6
                640mb pc2100
                64mb gf440go
                15" uxga ultrasharp
                40gb 5400rpm hdd 16mb cache

                • #9
                  Just found a faux pas on page 3:

                  In typical Matrox fashion, 2D image quality will be absolutely superb, thanks to a new 10-bit DAC pipeline, which allows for 4x as many colors as current consumer-level VGA adapters
                  Uhmm ... 4 times as many colors per color channel would be appropriate, which equals 64 times as many colors overall.
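
                  (To spell out the arithmetic, a quick back-of-the-envelope sketch, assuming 8 bits vs. 10 bits per R/G/B channel; the variable names are just for illustration:)

                  per_channel_8  = 2 ** 8    # 256 shades per channel on an 8-bit DAC
                  per_channel_10 = 2 ** 10   # 1024 shades per channel on a 10-bit DAC
                  print(per_channel_10 // per_channel_8)             # 4  -> 4x as many per channel
                  print(per_channel_10 ** 3 // per_channel_8 ** 3)   # 64 -> 64x as many total colors (R*G*B)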

                  Despite my nickname causing confusion, I am not female ...

                  ASRock Fatal1ty X79 Professional
                  Intel Core i7-3930K@4.3GHz
                  be quiet! Dark Rock Pro 2
                  4x 8GB G.Skill TridentX PC3-19200U@CR1
                  2x MSI N670GTX PE OC (SLI)
                  OCZ Vertex 4 256GB
                  4x2TB Seagate Barracuda Green 5900.3 (2x4TB RAID0)
                  Super Flower Golden Green Modular 800W
                  Nanoxia Deep Silence 1
                  LG BH10LS38
                  LG DM2752D 27" 3D

                  • #10
                    Okay Greebe... that's not exactly what I meant. I should have written more clearly. NVIDIA is in a stage of the development cycle where they're milking previous technology advances by bumping the clock speed and tweaking the tech. All the companies do it, with Intel being especially adept at it. There's nothing inherently wrong with that so long as it's done in moderation. New technology takes a whole heap of time and money to research.

                    • #11
                      The human eye cannot see higher than 30 frames per second on video. For 3D games, let's say 30-40; anything higher than that is wasted. However, the ideal is a constant 30-40 regardless of activity. People do not care for frame rates; they want a decent frame rate of course, but visual quality and performance are more important. Look what happened to 3DFX: they gambled that people would go more for speed than quality and lost.

                      I'll take a game with crisp, detailed graphics at resolutions over 2048 x 1536 at 32-bit and beautiful shading at 30 fps over a game at 1024x768 @ 200 fps any day, hands down.
                      You may be on top of the heap, But you're still part of it.

                      • #12
                        It was actually when 3Dfx switched from speed to quality that they lost. Sad but true.

                        But Matrox's target audience is a whole other bunch of people!

                        • #13
                          Ah. As an old elitist playing CS, DOD etc., FPS is the only thing that matters. It REALLY DOES improve your score to go from 30 to 60 fps. I switched down in details and resolution to get that extra reaction speed.

                          Besides: the human eye CAN be fooled by as little as 25 fps, but it is strained at anything less than ~80 fps. Check your monitor settings, if you do not believe the old biologist..

                          ~~DukeP~~

                          • #14
                            The monitor is another issue, my friend. You need at least 85 Hz because the phosphors can't hold their brightness for long, so anything less will be perceived as flicker.

                            Just for example, look at TFT screens: even at a refresh rate of 60 Hz they do not flicker.
                            Last edited by Novdid; 15 May 2002, 11:09.

                            • #15
                              Interesting discussion. But anifakis, the 30-40 fps thing is totally wrong. Movies and television aren't the same as watching 3D rendering on a computer monitor. For one thing, the framerate is ALWAYS consistent with those media. For another, televisions are interlaced, which effectively gives the appearance of more than 30 fps. Computer games, even with my G400max (which, of all the cards I've owned so far, had the least variance between minimum and maximum framerates in most games), vary in framerate. Further, computer monitors aren't interlaced. The eye CAN distinguish greater than 30 fps (in fact, 60 fps is still not usually great, which is one reason a 60 Hz refresh rate on the monitor contributes to eye strain).

                              In slow-moving games or renders, where motion is low, 30 fps might maintain a semblance of smoothness. Anything with fast motion will show stuttering or flickering at that framerate.

                              (Personal opinion here, since I play mostly fast-paced games, mainly 3D first-person shooters:)
                              Ideally, the monitor's refresh rate should be at or above 70 Hz (I prefer 75, but to each his own), and the minimum framerate should be equal to that, so that in the most stressful rendered situations (a small map with 20 players, all firing rockets pell-mell at each other; ask any MU UT player), the action is still smooth and there should be no stuttering.
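
                              (Here's a rough sketch of what I mean, assuming vsync, a fixed refresh rate and constant render times; the function name and example numbers are just for illustration. When the framerate falls below the refresh rate, some frames sit on screen for more refresh intervals than others, and that uneven pacing is the stutter.)

                              import math

                              def refreshes_per_frame(fps, refresh_hz, n_frames=8):
                                  # With vsync, count how many refresh intervals each rendered
                                  # frame stays on screen; uneven counts show up as judder.
                                  ratio = refresh_hz / fps   # refresh intervals per rendered frame
                                  return [math.ceil((i + 1) * ratio) - math.ceil(i * ratio)
                                          for i in range(n_frames)]

                              print(refreshes_per_frame(75, 75))  # [1, 1, 1, 1, 1, 1, 1, 1] -> smooth
                              print(refreshes_per_frame(60, 75))  # [2, 1, 1, 1, 2, 1, 1, 1] -> judder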

                              The Parhelia should be fine in this respect (or so I expect), at least allowing most of the same features and options to be enabled that any other contemporary (and possibly upcoming) 3D accelerator can have enabled, and still achieving 60 fps or better. I actually expect it will do better than most, since my ideal set of rendering features includes trilinear filtering, anisotropic filtering, and AA, at a resolution of 1024x768 or higher with 32-bit color depth. With the Parhelia basically offering trilinear + 8-tap anisotropic filtering for free, I'd be in heaven if I could afford it now (no hats here; my GF4 does well enough for now. It's just that the P will undoubtedly do much better still).

                              So please, it's far past time we stopped justifying lower framerates with that worn-out excuse.


                              Disclaimer: As I said, only slower-motion titles can get by with this; some recent realtime RPGs may fit here.
                              "..so much for subtlety.."

                              System specs:
                              Gainward Ti4600
                              AMD Athlon XP2100+ (o.c. to 1845MHz)
