What exactly is Matrox up to .... get the <facts>




SmellYfeeT
27th October 2001, 13:50
Hello everyone,

It's been a while since I last checked this board :)
Last night I had a phone conversation with a friend of mine who works at Matrox, and I thought I'd ask about all the rumours going around.

1) Matrox is NOT going out of business: He found it very interesting that people really believe M is packing its bags. Matrox has not laid anyone off recently; however, they are not hiring and are not trying to fill any vacant positions, and he said that, besides the current market, that is a major reason M is not being aggressive. Basically their R&D team keeps shrinking.

2) McDonald's and Matrox ????!!!!! He told me that Matrox has signed a big contract with McDonald's to sell them G550s. First thing I said was WHAT !!! No, no, no, McD. is not adding internet cafes :) However, they will be replacing their menu signs with 6 flat panel displays and will use 3 G550s, with their dual-DVI feature, to drive those panels. Pretty cool I guess. As far as I know that is only for the US. So for you guys in England and Germany, you still have to order from those cranky plastic signs, while we order our Big Mac from a flat panel which brings out the flavour with its high quality colour display "POWERED BY MATROX"..... ;)

3) Ongoing projects: He told me that Matrox is working on 2 major projects right now. In Montreal they are working on the mysterious next generation chip, and in Toronto and somewhere in the US they are working on its successor. He didn't know much about the chip that will follow later on, but he said they already have a prototype of the next gen chip and are working on its next revision, which MIGHT go out the door. He wasn't really sure what management is up to or what is going to happen, but he didn't think they would release it before Christmas.

So there you go, for those in doubt: Matrox has a new design ready for release, but they are not trying too hard to release it. I tried asking him about the size and some specs but he didn't really want to say anything. He is a very loyal employee :) As far as the name goes, internally they call the design Paheelia or something like that, but he said he doesn't think its name is going to be G800 or G1000 as the rumours imply. By the way, forget about DirectX 8; Matrox's new chip will be partially DirectX 9 compatible, if it ever comes out!

PS. When was the last time you smelled your feet? :)

-SmellYfeeT

Tempest
27th October 2001, 14:32
This, if true, is very comforting. And they probably call it Parhelia (http://ww2010.atmos.uiuc.edu/(Gh)/guides/mtr/opt/ice/sd.rxml).

http://forums.murc.ws/showthread.php?s=&threadid=26894&highlight=parhelia

Wulfman
27th October 2001, 14:44
The photo from your parhelia link looks very similar to the Matrox logo... :p

mfg
wulfman

JF_Aidan_Pryde
27th October 2001, 16:29
DX9... yep - I smell the displacement maps. :)

MK
27th October 2001, 18:51
SmellYaFeet - thanks for the info !

MK

KeiFront
28th October 2001, 03:47
Thanks SmellYaFeet for the useful info.

borat
28th October 2001, 04:38
I don't think we should get too excited about this news.
After all, it may just be the next generation of the G550 that they are working on.
Don't get me wrong, I want to see a high end board from big M as much as the rest of you; it's just that I am so used to being disappointed by them that I find it hard to take these claims seriously. I remember that this time last year I was waiting for the G800, which never arrived.
I can't help thinking that this time next year we'll all be speculating about a next gen board after Matrox release their G650 with the pathetic addition of something like bodycasting.
Looks like a Radeon 7500 for me then; if Matrox do release something (fingers crossed), I will have enough pennies to buy it.

mdhome
28th October 2001, 09:28
Borat, in most circumstances I would assume that they wouldn't make the same mistake again. However, the part of Smelly's post that does worry me is the bit about their diminishing R&D team. You can't design these chips with a skeleton crew.

regards MD

mute
29th October 2001, 01:42
Would you like fries with that g800 ?

Well, I think the market right now is dangerous; they're playing it safe, I guess...

roadie
29th October 2001, 10:16
At last, some news that Matrox is actually doing something!!!

I am intrigued by the DX9 statement; this could be a big gaming feature, but it could also be an extension of the 'headcasting' engine :(

However, I am glad to see they are working on 2 new chips - six month cycle anyone???

borat
29th October 2001, 13:42
Yeah, DX9 would be nice, but the G550 partially supports DX8 through its headcasting features, which are nigh on useless.

SwAmPy
29th October 2001, 17:26
I sure hope that a lot of people in Matrox knew about the McDonald's deal, or else your source will be toast. They may even accuse your source of being Casey.

SwAmPy

Ant
30th October 2001, 00:20
Who's your source, the PR guy!? People have been laid off.

KngtRider
31st October 2001, 12:07
Originally posted by JF_Aidan_Pryde
DX9...yeap - I smell the displacement maps. :)

Do any Matrox chips do displacement mapping? And what is it, exactly? I forgot :p

I found this on the Meltdown 2001 web site

(MS dev conference for DX9)

SESSION 5
Displacement Maps As A New Geometric Primitive -- Matrox_Displacement.ppt (1130KB)
Speaker: Juan Guardado - Matrox Corporation

http://www.microsoft.com/mscorp/corpevents/meltdown2001/ppt/Externals/Matrox_Displacement.ppt

MK
31st October 2001, 13:28
Welcome on board KngtRider !

MK :)

JF_Aidan_Pryde
31st October 2001, 13:56
No chip on the market does displacement mapping. I'm 99.99% sure that Matrox's next *real* chip will feature it.

KngtRider
31st October 2001, 21:18
Well, you can dismiss that last .01%, because a Matrox 3D engineer presented displacement mapping, so they DO have it in their new GPU. It's just that they will be up against S3, PowerVR, nVidia and ATI, and if the GPU has a limited set of caps, a la the G550, it will not be very appealing.

Their whole current family of chips doesn't do the following, and I'm afraid their future products will follow the 'unified' model and still have a limited hardware and software feature set:

* no overlay adjustment in the driver
* no advanced 3D or video controls in the driver
* complete lack of any MPEG-2 assist (whether or not it is useful for picture quality is beside the point)
* poor multitexturing at the moment
* their regime regarding driver features/updates/bugs is completely different to the others
* very fixed performance, not very dynamic (i.e. crap performance on 1.0 systems and above)

mdhome
1st November 2001, 00:12
Hello KngtRider
Just a question. How can you be certain that Matrox's next part will be as limited as you believe it to be?

regards MD

KngtRider
1st November 2001, 01:00
I don't know that it will be.

Bad wording on my part; take it as 'I hope not', sorry.

Wulfman
1st November 2001, 14:08
I followed some links from 3dcenter.de and found the following:

http://www.p3int.com/news_center_Matrox_support_for_NWO.asp


"Matrox Graphics is pleased to be working with Termite Games and Project Three Interactive in ensuring that our hardware is fully compatible with New World Order," says Mike Drummelsmith, developer relations manager, Matrox Graphics. "The visual effects in New World Order are simply stunning and, when paired with Matrox's trademark image quality, result in an action-packed gaming experience."

If you look at the tech demo - is the G550 capable of that???

http://www.p3int.com/downloads_NWO.asp

Looks like Matrox is more interested in gamers than they say...

mfg
wulfman

GT98
1st November 2001, 19:27
I dunno how much water this holds..... I remember in the G550/G800 days there was a developer who was working with Matrox on a kung fu fighting game based off the Unreal engine... I haven't heard anything in ages about the project. I think if you do a search in the Hardware forums you can find something about it.... it was pre Crystal Ball days.


Scott

TdB
2nd November 2001, 05:21
Wasn't that kung fu game supposed to use matrix skinning? It was before we heard about displacement mapping, IIRC.

SmellYfeeT
3rd November 2001, 04:56
I sure hope that alot of people in Matrox knew about the MacDonald's deal or else your source will be toast. They may even accuse your source of being Casey.

For a sec you got me worried there, 'cause if anything happens to him I would be dead in no time :) But this is not recent news; I heard about it from one of my coworkers during the summer. I think it was originally leaked on a McDonald's stock message board a long time ago, but there was no reference to Matrox!? So now we know who the supplier is ;) I wonder if the deal is still on, or whether the downturn in the economy scrapped it all... :(
I was surprised that Ant posted the message on the front page, since it really does not contain any useful information except for a promise! Well, I know some other stuff that I really don't think is appropriate to post publicly, since it might give nVidia and ATI an advantage, which we don't want, right! :) But I can tell you, if Matrox can continue to deliver their traditionally stable drivers for a gaming card, it will be a big seller. However, management is playing it smart, and they don't want to do what 3dfx did before they bailed out! Which means: don't put a product on the market if you can guess it will not sell!

Ant
3rd November 2001, 06:51
Any news is news when dealing with the tight lipped Canadians, and you did state they were "facts" :)

Tom
3rd November 2001, 11:28
Originally posted by Ant
tight lipped Canadians:)

No comment.:p

K6-III
3rd November 2001, 14:43
According to Reactor Critical, Matrox has followed ATI in selling chips to other manufacturers. This may be what we can expect from them in the future.

Liquid Snake
3rd November 2001, 17:05
Yeah, there was also some guy who posted here claiming his company was "almost done" with the G800 techdemo and that was probably a year ago, or more.

Sasq
4th November 2001, 05:25
Not sure what's driving them here (Japan), but some of the McD's are using the flat panel displays here already - at least for the last 6 months or so.

Dan

SmellYfeeT
4th November 2001, 08:26
So do they have video advertisements and stuff on those flat panels, or is it still static text and graphics?

Sasq
4th November 2001, 16:18
Advertisements, mini ads: "this is the breakfast menu, wouldn't you really like a hash brown", etc. Pretty cool actually.
Half the panels or so keep the menu up (second head??); the panel they used to put the current special on as a static display has become a rotating advert, and it's animated.

Dan

SmellYfeeT
4th November 2001, 20:13
Well, what do you know, maybe McDonald's will use HEADCASTING to take orders soon :)

Peter Aragon
5th November 2001, 06:26
I only have one thing I want to say:
I want that video card that www.p3int.com used for making that real time demo.

Haig, can you at least say whether that's a non-Matrox card they used, and when, or at what (gamers') event, we can see it in action?
:D

Kind regards,
Peter Aragon

Haig
5th November 2001, 09:56
If I gave you the standard no comment reply, that would make it obvious.:D

Haig

Helevitia
5th November 2001, 10:25
Interesting. Here is a blurb from the p3int site....

"Project Three Interactive’s upcoming release, New World Order, got yet another boost today as Matrox explicitly pledged their support for the highly anticipated shooter. The game, being developed by Swedish studio Termite Games, is set for March 2002."

Just grasping for straws like everyone else :D

Dave

Flip-Oh
5th November 2001, 10:54
Hey Peter,

Are you sure you want that videocard?
Because at the end of the demo it says that they used a Geforce 2 GTS (and an AMD Athlon 1GHz) ;-))

(Or are you talking about another demo?)

Greets,
Jorg

Peter Aragon
5th November 2001, 13:17
Thanks Haig for reassuring me that the wait won't be for nothing ;)

As for Flip-Oh:
[Dutch mode ON] Godverdomme man, je hebt me zover gekregen om die demo alweer te downloaden[Dutch mode OFF]

(translation: you bastard you made me look ;) )

I was talking about ftp://ftp.p3int.com/NWO-movie2.avi

I especially like the motion blurring effects, stunning! I don't believe a Gefarce can do that, at least not in real time, not even a Gefarce 3.

Regards,
Peter

Tom
5th November 2001, 16:12
Here's something interesting ...
the Rendering for N.W.O......and Matrox is working with P3i..

-Efficient graphics/texture compression/decompression, without any quality loss
-Supports hardware T&L for increased throughput
-Advanced motion blending / smoothing

There's more at

P3i (http://www.p3int.com/product_center_nwo_cd_rendering.asp)

Jorden
6th November 2001, 10:19
Originally posted by Peter Aragon
As for Flip-Oh:
[Dutch mode ON] Godverdomme man, je hebt me zover gekregen om die demo alweer te downloaden[Dutch mode OFF]


Je wilt toch niet klagen dat je verbinding zo slecht is dat je geen 110k/sec haalt op je PI-Advanced, he? ;)

Jord.

Peter Aragon
6th November 2001, 11:34
Originally posted by Jorden


Je wilt toch niet klagen dat je verbinding zo slecht is dat je geen 110k/sec haalt op je PI-Advanced, he? ;)

Jord.

(translation: You don't want to complain that your connection is so bad you don't get 110k/s, do you? ;) )

Tja, duurt toch weer 5 minuten met 208 k/s :cool:

(Translation: Well, it does take another 5 minutes with 208k/sec :cool: )

Flip-Oh
6th November 2001, 11:45
(Oeps is Dutch for oops ;-)


My bad! I thought you were talking about

ftp://ftp.p3int.com/NWO-tech.exe


I was wondering what all the fuss was about, since that demo doesn't show anything amazing except for a VERY high detail level. D/loading the other demo now. Let's see what you're so excited about ;-)

Groeten,
Jorg

P.S.:
Ik moet het met 50k/s doen, maar daar is ook nog best op te wachten ;-)

(translation: I have to make do with 50k/s but that's still not bad ;-)

KeiFront
8th November 2001, 13:10
New rumors about Gefarce 4 (and GF3 Ultra)

http://www.chip.de/news_stories/news_stories_212205.html

Maggi
9th November 2001, 05:48
Originally posted by KeiFront
New rumors about Gefarce 4 (and GF3 Ultra)

http://www.chip.de/news_stories/news_stories_212205.html

I wouldn't call an article dated from August 15th new and I guess the GF3-Ultra is now named Titanium ... ;)

Maggi
9th November 2001, 05:49
PS: here's the english version of that article ...

http://www.chip-online.com/news_stories/news_stories_5419.html

:)

KeiFront
9th November 2001, 06:36
OOPS my mistake :confused: .

[GDI]Raptor
11th November 2001, 15:52
Why aren't there any rumors about any upcoming Matrox graphics cards?

mdhome
11th November 2001, 23:06
Originally posted by [GDI]Raptor
Why aren't there any rumors about any upcoming Matrox graphics cards?

Possibly because no bugger seems to know anything :)



regards MD

Peter Aragon
12th November 2001, 00:02
I'll make sure I'm in Hanover for CeBIT in March for the big unveil; I hope it's something from Matrox.

It wouldn't bother me if they released something sooner than that. But my Matrox pen is getting old, and I can't wait to see the new design ;-)

K6-III
12th November 2001, 04:01
I'd expect some rumours closer to the anticipated launch date of February.

Ant
12th November 2001, 05:27
OK, you want a bit of a rumour? Here goes. The next generation chip from Matrox, the Parhelia, has the potential to be faster than a GeForce 3 and an ATi 8500 in 3D. It can support up to 256MB of DDR RAM. It was due to be announced this autumn but has been delayed until next spring. Whether Matrox have the resources left to see the R&D potential of the chip fulfilled as a retail product is another issue. Discuss...

breezer
12th November 2001, 05:56
Wasn't Parhelia another code name for a g4xx core chip?

Ant
12th November 2001, 06:33
Nope.

Novdid
12th November 2001, 10:45
The name Parhelia on a retail product would be wonderful.

Just listen "Matrox Parhelia MAX".:):)

[GDI]Raptor
12th November 2001, 11:13
Hmm. If this card is to be released "spring 2002" then it should be as fast as the upcoming GeForce4 and the next Radeon!

CeBIT..... Hmm.. this year I'm going to travel to Hanover myself and visit the trade show, so perhaps I can get myself a Matrox pen too.... :-)

We should hear something from Matrox before Christmas, or in Jan/Feb, if this card is going to be released "spring 2002". Perhaps some "hints" at Comdex........?

mdhome
12th November 2001, 11:58
Originally posted by Ant
Whether Matrox have the resources left to see the R&D potential of the chip fulfilled as a retail product is another issue.

Ye gads, I hope they release this thing. If they do, they can have my order (assuming it's not 2 years late).

MD

Ali
12th November 2001, 12:52
I would say Matrox could get the Parhelia out as a retail board; it's just, do they want to?

It would cost them about the same to get it retail-ready as it did to release the G550, wouldn't it?

It would make sense that if they only had the resources to release one card, the G550 would be it, because it would make them more money.

I don't personally see the point of the G550, as for business users the G450 is identical and (here at least) is under half the price.

As long as Matrox is still making more money from selling their cards than they are spending developing them, we shouldn't have much to worry about.

Ali

Helevitia
12th November 2001, 14:01
thanks Ant :)

SmellYfeeT
12th November 2001, 21:07
I would like to confirm that what Ant stated about the potential of the Parhelia is true. Why don't you post something about the memory bandwidth of the Parhelia? Seems like you have the right information :) I'm sure people will love it. The more I hear about the structure of the Parhelia, the more I think it will be a kick-ass card! I also strongly believe Matrox will come out with a complete family of products using this technology! I heard yesterday that ATI will be announcing a new product in the next 2 weeks! I have no idea what it is! Any information, anyone?

Hati
12th November 2001, 22:05
I only wish Matrox would announce a new product in the next few weeks. After upgrading my system from a PIII 800 to an AthlonXP 1700 I would like to replace my G400 soon.
I wonder what keeps Matrox from getting the new chip out. Looks like they could do it, but don't want to :confused:
I am really tempted to get a Radeon 8500 if ATI gets their drivers sorted, but then, reading - again - about something new from Matrox for high end 3D / gaming, I think I could wait some more.
"Spring 2002"? I hope Matrox at least announces something a bit earlier, to let me know what I'm waiting for. It's getting pretty frustrating
:(
Hati

KngtRider
12th November 2001, 22:37
From reading the latest posts, this is what I think:

* There is no way in the world the new product will be launched before Xmas; the timeframe is too short and we would have heard something about it already. Besides, the Xmas/New Year period is in the way.

* Maybe they will launch it in conjunction with DX9?
- first drivers for 2k
- first independent display for 2k
- IE integration
- first WHQL driver for XP

All of these were surprises, and Matrox TS didn't say anything beforehand about their upcoming independent display functionality.

Since Matrox features were presented at Meltdown, it wouldn't be surprising if the new chip was launched in conjunction with MS/DX9.

I am not interested in whether the card is shipping or not within the next decade; I just wanna see some info/specs. Maybe it's not good for gamers.


* nVidia have crap image quality and video, plus general compatibility issues; some games have major issues

* ATI need to work on their XP driver for the R8500; although their chips have potential, it's not being shown right now; game issues

* Matrox are good in all fields, except they're too bloody slow in games

Some info on the new part would make things much easier.

I require VIVO and GF3/8500-level speed and feature set.

The nVidia owners who have ridiculed me for owning a G400 will get their payback.

Unam
13th November 2001, 12:03
Hi KngtRider,

I agree that the nVidia image quality is not on par with Matrox (who is?) but it is not "crap", I'm running a MX2 at 1024x768x32 with no problems at all.

It is a given that ATi could really use a lesson in writing drivers, but I remember Matrox's OpenGL fiasco all too well...

I love my G400, but it is getting very long in the tooth; the G450 and G550 were nothing more than half-hearted attempts to keep people interested. It worked, because I still have my G400!

It is time for Matrox to step up. If they don't release something respectable soon (within the next few months - some may argue that it is already too late), they will be relegated to also-ran status.

An interesting thought may be that this is exactly what Matrox wants! Matrox has the resources (technical and financial) to release a high end product, the question I have is why wouldn't they?

mute
13th November 2001, 12:58
http://www.matrox.com/mga/dev_relations/dev_events/home.cfm

Check this out: why would Matrox go to the Game Developers Conference and E3 in 2002? Surely not to show off their existing cards.

Press release in March and production this summer?

mdhome
13th November 2001, 13:59
I hope that it appears before next summer. That would make it 3 years between the introduction of the G400 and its replacement.

Unam, I can't see why Matrox wouldn't want to release a high end card. What possible reason could they have for wanting to be 2nd best?


regards MD

efty
13th November 2001, 15:24
How do you guys know that Matrox has the financial ability to release a new card?

It seems 2 years without a new card and the loss of so many OEM clients has "killed" Matrox.

Some outside financial boost would be good right now, but who would do that at this time?

I still have a Marvel G200. I am totally satisfied with this card despite the Win2k MJPEG problems. I have seen many cards... nVidia, 3dfx; none of them have the crispness of my G200.

What I want from Matrox is another card that will last me another 4 years :)

Terrafin
13th November 2001, 16:40
I agree with Unam...

I myself have had a G400MAX :cool: for a while now (hehe, got suckered in by the bump mapping feature and 2D excellence), and although I love the card and only play games on a regular basis, I'd still like to know my video card has the power to do whatever...

I was planning to buy the ATI 8500, but the drivers held me back and I bought a Toshiba Satellite 3000 series instead for work. :D And I have to tell you, the GeForce2 Go in it is faster than my MAX! And with nVidia's latest mobile solution, the NV17M, coming out, it just really shows how far Matrox is falling behind in the 3D market area... although I wish they would use their chips in the mobile market... Let's just hope that when, or IF, they do come up with something, it can also be integrated into laptops, and that they really try to span their creative ideas into all areas of the computer market... Maybe they're having problems shrinking the die size? Who knows...

Unam
13th November 2001, 17:05
mdhome

My thought here was Matrox has completed a market analysis and discovered that being king of the hill as far as 3d speed is concerned doesn't pay...

If you are already acknowledged as the best in 2D (where most everybody works) and you have passable 3D, where is the return in being the fastest on both sides? If you can make your margin at current levels...

Ugly thoughts, but I thought it may spark some comment!

eftychios

I don't have access to Matrox's books (given that they have no shareholders, nobody else does either!), but if they could release the last two pieces of hardware (450 & 550) then they have the resources to release something worthwhile.

My two cents worth...;)

Kruzin
13th November 2001, 18:20
Or it could be that M has lost some talented engineers over the last year, so their latest technology has taken longer to develop than originally anticipated...
:rolleyes:

Could be that when this new mystery chip comes out, it will rattle the graphics industry.

Or, could be a G550.5 with bodycasting

Hmmmm.....

:confused:

:D

KngtRider
13th November 2001, 22:09
Yes, I know nVidia 2D is decent, but you (all) know what I mean......

Well, the G400 is more advanced than the Voodoo3 series when it comes to future-proofing and general feature set.

(Don't compare Glide and GL; I am talking about general features and D2D/D3D.)

Matrox does have the resources to make new designs, but making a design with world-first features (Mill 1, G200, G400, etc.) is not easy.

They will release a new card, but the question is when.

nVidia became bigger when the TNT2 came out and their Detonators program ramped up.

Recall that nVidia was nothing once, and Matrox was leagues ahead.

Matrox is VERY close with MS on graphics matters and hardware support.
The new part would be a killer part, but timing, target market and cost will be the killer.

Remember the Voodoo5?.........

Unam
13th November 2001, 22:36
Could be that we will all be very old before big M releases that new chip Kruzin. :p

I remember the Voodoo 5 very well, KngtRider. I remember when 3dfx was king of 3D, but I also remember when Matrox was the best in the video card industry. If my failing memory serves me correctly, that was a long time ago! Still have that Mill II in a box somewhere - could be that's where my beloved G400 will end up as well... :rolleyes:

mdhome
13th November 2001, 22:44
Unam
I'm not so sure. They may still make the same amount of money on a card that costs £100 as on one that costs £200, but there is a knock-on effect from having one of the fastest 3D cards on the planet. It's called visibility, and it is something that makes people (OEMs) sit up and take notice of you. A cut-down version of their next gen could be the product that ends up being sold in every OEM PC.

Another ugly thought: with their excellent 2D and poor 3D, Matrox's current offerings are unbalanced.


regards MD

Unam
13th November 2001, 22:53
Points well taken MD, my comment would be you would actually have to release something that catches an OEM's attention for that to work... we have not seen anything like that from Matrox in a long time.

IcedEarth
14th November 2001, 03:45
Matrox will have to come up with something soon, because apparently the GeForce 3 Ti500 that Leadtek has released has superior 2D image quality. And that's the main reason I've stuck with Matrox all this time: 2D image quality.
At www.firingsquad.com they've tested a Ti500 from Leadtek, and they say it's better than Matrox's image quality.....

Amiga Blitter
14th November 2001, 09:53
The Canopus GeForce3 also has super 2D image quality. The Canopus uses very high quality output filters.
The Elsa GeForce3 has good 2D image quality.

windigo
14th November 2001, 19:01
Mmmm, Canopus; haven't heard that name in a while :)
Bought a Voodoo from them a long time ago to go with my
Mystique 220 (still going strong in my work machine, btw).

JF_Aidan_Pryde
14th November 2001, 23:49
..bodycasting?

PUAHAHAHHAAHAHHAHH--## .. sorry. :D

RedRed
16th November 2001, 02:12
Jade Falcon:

This body casting/arse casting could have potential with the 'men in dirty coats'!!!!

:rolleyes:


:eek:


:o


errr.... I wouldn't buy one....... honest (but I wonder if it could be spanned across 2 monitors........:D )

dual body casting ohhhhherrrrrr!


RedRed

RedRed
16th November 2001, 02:15
Looking at the list of upcoming shows, it does appear to be predominantly game ones though......

Might be an interesting year.... if they can hold out..... (if I can hold out.....)

2Whyzzi
16th November 2001, 12:35
Originally posted by KngtRider
...Matrox is VERY close with MS on graphics matters...
Right, and that's why Microsoft chose nVidia for the X-BOX...

Originally posted by KngtRider
...The new part would be a killer part, but timing, target market and cost will be the killer.
That is what worries me the most. What if Matrox's new part isn't close enough to nVidia's & ATi's new parts -- will Matrox still release it?

RedRed
16th November 2001, 15:08
I think that Matrox video cards are probably never going to be 'up there' in 3D....

I don't think that should matter. I would be happy with GF2 (Ultra) performance or thereabouts, but with DualHead and the perfect picture quality we expect. It would be nice if it had a few extra features, but I wouldn't 'bust a gut'....

Topping the 3D poly counts is OK, but it only lasts for a while (a matter of weeks with some cards!!!). It would be better to produce a mature card which has the investment potential of scalability, but with 'reasonable' performance.

The G450 / G550 were both dogs in this respect. They have the usual 2D performance, but not the balance of reasonable 3D - let's face it (and we all have), they are completely shagged in the 3D department.

The G400 was good in its day, and its reliance on a beefy processor (for respectable 3D) kept its longevity going for a while. But it needs a revamp. A real one.

Do Matrox have the talent/resources left to get themselves over the 'hump'? That is the 64K dollar question....

Regards
RedRed

JF_Aidan_Pryde
18th November 2001, 05:27
..being king of 3d doesn't pay

Can't agree more. NVIDIA never gets paid. :D

KngtRider
19th November 2001, 11:21
They are spilling out more and more info by the day:

http://forum.matrox.com/mgaforum/Forum8/HTML/000548-3.html

EDIT: if you read the previous pages, he says a normal system/PSU will be fine.

Also, Haig said 'MGI have no current plans for a low end capture card' when asked if the new card will do analog capture.

And an unknown person in one of the other topics said it will (probably) be a G600/750 and will be in shops in 2002, with DX9.

Tell us something we don't know......

At least Matrox won't stuff up the launch and get bad press with poor drivers.

Ant
19th November 2001, 12:11
Don't expect any more video capture cards from Matrox Graphics. Do expect high performance, high priced 2D/3D, with the usual disclaimer of course, if they have the resources left to get it to market.

Hati
19th November 2001, 12:31
Why wouldn't they have the resources?
Did they lose so many talented engineers to nVidia?
I am impatiently awaiting a new high end 3D/gaming card from Matrox, but I don't know if I should wait much longer.
Looks like they have some promising chip almost ready. Hope it will come soon, because I surely won't wait until next summer (spring would be OK).
Hati

GT98
19th November 2001, 13:58
Originally posted by Ant
Don't expect any more video capture cards from Matrox Graphics. Do expect high performance, high priced 2D/3D, with the usual disclaimer of course, if they have the resources left to get it to market.

So this means no Marvel type products from Matrox?


Scott

borat
19th November 2001, 14:05
Being the king in 3D always pays, just so long as you are the king.
This means you need highly featured kit with great drivers and blistering benchmarks, way ahead of the competition.
nVidia and 3dfx both showed this to be the case when they released their high end cards (until 3dfx got complacent).

It's being in second place that never pays.
If you're in second place, then:
number 1, you don't get to make huge profit margins on your ludicrously expensive high end cards, as the competition have faster cards for the same price; and
number 2, you don't have the public perception that your cards are the fastest, so OEMs don't want to buy your stripped-down product en masse, as Joe Public demands the badge of the fastest player in town on his card.
It may be enough for us dedicated few to have GeForce 2 Pro performance, but if Matrox wants to stay alive this card must be the best out there, with loads of features and a high price tag.
I hope that the G600/750 name is wrong, as it suggests the card is less than 2 times faster than the G400, and if that's the case I do not think Matrox will be with us in a year's time, which would be a crying shame!

This card needs to sell en masse in a cut-down version at a cheapish price, and sell in limited numbers in the costly high end variant, if only to get the Matrox name plastered across magazines and web pages as the fastest around.
Good luck Matrox, and I hope you can repay our faith in you with what we've all been waiting for.

Kruzin
19th November 2001, 14:08
I wouldn't expect to see any more "low end" capture cards, like the Marvels, out of the graphics division any more. This doesn't apply to mid-high end solutions (ala RT2x00, DigiSuite) from the video division though...

Ant
19th November 2001, 14:32
Nope, Video Products Group will continue with their vid cap hardware, just no cheap "consumer" boards from Graphics.

J1NG
19th November 2001, 16:03
No MARVEL type products? Oh darn it. :(

Now it means I have to find a Marvel G200 AGP or get the NEW Matrox 2D (with the Oh So Amazing 3D features) Card and the RT2x00 (that is Very Pricey)... :rolleyes:

Great, I'm gonna be spending in the region of a grand when the graphics card and (maybe the next version of the RT2x00) video card become available at the same time. :( Better start working on the excuses now so I know what to say when I ask to borrow the money... :p

J1NG
*sigh* remembers the good old times with the Marvel G200 PCI :rolleyes:

Unam
19th November 2001, 22:43
Don't expect any more video capture cards from Matrox Graphics. Do expect high performance, high priced 2D/3D, with the usual disclaimer of course, if they have the resources left to get it to market.

Ant, lots of hints floating around these forums, so I've got to ask: what do you know that we don't?

Ant
20th November 2001, 00:12
I just seem to be getting a lot of very well informed email these days. I only pass on those rumours that keep surfacing repeatedly and that I have faith are true :)

<i>Disclaimer: no Matrox employees were harmed or in any way mistreated during the production of these posts</i>

Peter Aragon
20th November 2001, 01:33
Originally posted by Ant
I just seem to be getting a lot of very well informed email these days

Me too :)
It informed me that Neil Evan Caminsky should clean his Nimda-infected machine. I wonder why he kept the e-mail address of a newsletter group on his computer; wouldn't it be company policy not to keep a bulk email list on a workstation? Too bad no details got leaked about new things to come. :D

Well, no harm done Neal, be glad your IT-department got that e-mail scanning service running.

GT98
20th November 2001, 05:22
Originally posted by borat

nvidia and 3dfx have both showed this to be the case when they released their high end cards.(untill 3dfx got complacent)


I would have to disagree with this statement. The things that killed 3dfx were it buying STB, its inability to get out its next gen card (I think it was called Spectre or something ghost-related) that had been in development since the Voodoo2 days (it almost happened too..... but nVidia bought them before it happened), and the total failure to get into the OEM market.

I think the only reason Matrox Graphics is staying in business is that it's making a killer profit off the G450/550 series, since they have been in production so long and, like most electronic/computer-related products, the costs go down as the product ages. I can't wait for them to come out with a new product, but like everyone else is saying, they need to come out with something that's faster than the competition to make a name for themselves again, like the G400. If they don't, they'll be stuck in mediocrity and then out of business in the graphics department.

Scott

borat
20th November 2001, 08:18
scott in my opinion 3dfx could have continued to do business enen after they bought stb and even without oem support.
this is because they had a loyal fan base who would have been willing to buy another of their cards had it have been a good performer.
however there was only 16 mb of ram on the voodoo 3's and they could only do 16bit colour coupled with the fact that both matrox and nvidia had quicker parts out their which meant that sales werent what they could have been. also the voodoo 5 was very late and not as quick as the geeforce 2 in the benchmarks.
it was also underfeatured compared to the geeforce 2 and unfortunately the voodoo 5 6000 which had the potential to beat the geeforce 2 and would have been the only 128mb mainstream graphics board even today didnt make it out before they went bust.
matrox have survived nicely on there g400/450/550 sales but can you really see businesses buying slightly newer versions of these over the coming years? me neither as ati and nvidia are getting closer in the 2d game of late and offer more bangs per buck in 3d!
this is why we need a new supercard, so that matrox can restore public opinion in its name so oem's buy matrox and they stay afloat.

birdy
21st November 2001, 15:52
I don't know about everyone else, but I would buy a new Matrox card that was 'only' as good as a GeForce 2.
I have started to notice my G400 needs lower resolutions to play newer games fluidly, but my GF2 has handled pretty much everything at medium resolutions, so I would be happy with that.
The complete lack of anything remotely new is bound to hurt Matrox in the long run.

SmellYfeeT
22nd November 2001, 05:02
Hehe, well, too bad birdy, I don't think you will get a card like that. Matrox apparently didn't think being king in 3D was that important after the G400, so they stopped making a high end card, or maybe their plans were not so good, so they scrapped them. But I know that the next card will go head to head with ATI and nVidia, meaning it will be faster than both their flagship products at this time. The Parhelia, or whatever it is called, was supposed to be coming out around this time! So if it were out now, it could totally KILL ATI and really hurt nVidia! That's why I shorted nVidia stock at the end of summer, and I am losing way too much!! I have also heard rumours that Matrox believes more speed and performance will be unnecessary after that point! So the product after it might not be the next killer beast; they will do the same as they did with the G400: make as much money as they can, then spend more on another design!

Hati
22nd November 2001, 06:50
SmellYfeeT:
So I guess they are holding back the chip because it would break their heart if they killed ATI and severely hurt nVidia.
They must be waiting for their opponents' next gen chips, to give them a chance to survive!
Very grateful! :confused:
:D :D :D
'Hati

borat
22nd November 2001, 10:41
I have to agree that after the next card which Matrox (might?) release, performance will not be so much of an issue.
I think this because, in order to be more demanding on graphics cards, games and 3D programs need more complex and lengthy code, which will mean more people to develop a game, longer development times and ultimately higher-cost games.
People will not pay much more for games, hence games will not get much more complex, and therefore graphics cards will not have to be updated that frequently.
I believe that once this point is reached, the emphasis will be placed on features and the ability to generate good image quality at fast speeds with those features.
That's just my opinion though!

mdhome
22nd November 2001, 12:12
I'm sure anything that Matrox has available will be slowed significantly when running games such as UT2 and Doom3. These games and their visually rich engines are about a year away. Unfortunately, I want it now :)

regards MD

TdB
22nd November 2001, 15:48
Perhaps Matrox thinks that their Parhelia card will be fast enough FOREVER for ALL games :D

Seriously, all we need is constant framerate = refresh rate and #polygons = #pixels; and even if we talk about advanced features like raytracing, at some point in the future the bottleneck will ultimately be the monitor no matter what, and that doesn't sound like science fiction to me.

And even in science fiction movies, the monitor never exceeds 64" -
at least monitors located indoors :p

Indiana
22nd November 2001, 16:01
Originally posted by TDB
perhaps matrox thinks that their parhelia-card will be fast enough FOREVER for ALL games

If you look at their "recent" offerings ("recent" in Matrox/Amiga/BitBoys time-continuum, this is where time advances about 5 times slower than in the universe we know), Matrox seem to think that the G400 is fast enough forever for all games :eek:

But I'd like a pair of those 64" monitors as well, we might need them to be able to read the text in a future where NVidia is the only company producing gfx-cards... ;)

Tempest
22nd November 2001, 17:11
I have to disagree with borat. The graphics business might seem quiet for now, but new revolutionary technologies are just waiting to emerge. New innovations (some might even be developed by Matrox themselves) allow game makers to quickly create complex 3-dimensional models, and once a complex model is ready, it can be reused infinitely. New memory technologies and graphics with 64-bit colours are knocking at the door. If Matrox thinks that it can release a new chip now and the HoloRunner add-on five years later, then the new chip had better have support for at least 3 add-on processors and fully programmable WorldCasting™.
But I have faith in Matrox. I believe that they have learned from the past years and will from now on ensure that new technology is adopted at a sufficient rate. Consumers are starting to confuse Matrox with Maxtor, for chrissake. :mad:

mdhome
22nd November 2001, 23:03
Graphics cards will never be too fast. There isn't a single one that can do 1600x1200x32 at 60Hz with full AA in the newest games. In 10 years' time we may even be near real-time Shrek quality, but it still won't be fast enough. All I can say is: roll on the holodeck.

Regards MD

Hati
22nd November 2001, 23:05
I guess it's too optimistic to think a new chip will serve our needs for the next couple of years. Development in this sector moves at a high pace (DX9/OGL 2.0 are already being worked on), and being 'king of 3D' never lasts longer than a few months.
I am hoping that Matrox will present some very innovative technology with a new chip soon. But some comments here (e.g. from Ant) make me wonder whether Matrox is pushing the work in this respect. Now that the Bitboys seem to have been stopped by Infineon, I hope Matrox will be the next to try some kind of revolution with their next gen chip. But so far all we have are some rumors.
Too bad the people who might know more are not allowed to speak because of their NDAs.
Still putting my hopes on Matrox !
Hati

KeiFront
23rd November 2001, 05:56
Hati, I hope you were referring to this quote from Ant.


Originally posted by Ant
Don't expect any more video capture cards from Matrox Graphics. Do expect high performance, high priced 2D/3D, with the usual disclaimer of course, if they have the resources left to get it to market.

We should expect high performance in 2D and 3D, otherwise no more Matrox for me :( The price tag isn't that important to me (for some it is); I've had 3 years to save ;) I once bought an Nvidiot card: great 3D, but the 2D sucked big time; it took me two weeks to switch back to my Max :D

Hati
23rd November 2001, 06:07
Well, what worries me is the last part:

...if they have the resources left to get it to market.
Maybe Ant could explain why he is not so sure about this?
Hati

KeiFront
23rd November 2001, 06:15
If they can't get something to market because of a lack of resources, we've got ourselves a new 3dfx.

A dead graphics company :( But I hope Ant can clarify his quote.

TdB
23rd November 2001, 07:28
5-6 years from now, I would expect 3D performance to be where 2D performance is now:

that is, unimportant, because it is just fast enough for everything.

I can't remember the last time I ran a 2D benchmark, I really can't.

5-6 years ago we couldn't run at the res/colours we wanted because of the graphics card; now the monitor/eyesight is the bottleneck.

I'm only talking about speed, of course.

borat
23rd November 2001, 08:18
TdB, you have a valid point. I mean, who can really honestly say that they require 2GHz of processing power from their computer, apart from number-crunching professionals?
The fact is, however, that despite this, processors are getting faster and are still selling, so maybe graphics cards will also get faster even when we don't need the speed.

Novdid
23rd November 2001, 11:06
No, I don't agree.

Say that in 5-6 years most of us run our games at 2048x1536. Then let's say that monitors don't run any higher resolutions; then the importance of FSAA becomes apparent. Remember that 4x FSAA requires 4x the fillrate (with today's technology), and in 5-6 years we will probably have 16x FSAA as a minimum setting (just a guess).

My friends, 3D performance is not like 2D performance. If future chips are soooo fast that they are limited by our monitors, then programmers will add various kinds of bump mapping, enable FSAA as a default setting in games just as all of today's games use bilinear filtering, and use obscene amounts of polygons (which will need a very beefy CPU) to create more or less lifelike games. You see where I'm going with this: 3D graphics can evolve endlessly.

2D graphics haven't reached the top just yet either; just look at how the various effects in WinXP slow down all the cards out today...

TdB
23rd November 2001, 15:09
Even with FSAA there is a maximum number of pixels that needs to be rendered, and that maximum is dictated by the monitor; and if the 3D engine uses more on-screen polygons than pixels, then it is just poorly programmed LOD (Level Of Detail).

Basically: if a monitor has N pixels, then the time it takes to render a frame is proportional to N, because each pixel takes a finite time to render.

And 16x FSAA doesn't need 16x speed; the newest rumors say that on a tiler, FSAA can be done (almost) for free.
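
To put some rough numbers on the "proportional to N" idea (just my own back-of-the-envelope sketch; the overdraw figure and the assumption that 4x FSAA is done by brute-force supersampling are mine, not anything from Matrox):

# Back-of-the-envelope fillrate estimate: frame time scales with the
# number of samples the card actually has to shade per second.
def required_fillrate(width, height, refresh_hz, overdraw=1.0, aa_samples=1):
    """Samples per second needed to hold the given refresh rate."""
    return width * height * refresh_hz * overdraw * aa_samples

# 1600x1200 @ 60Hz, assuming an average overdraw of 3, no AA
base = required_fillrate(1600, 1200, 60, overdraw=3)
# Same scene with 4x supersampling: four samples per output pixel
ssaa = required_fillrate(1600, 1200, 60, overdraw=3, aa_samples=4)

print(f"no AA : {base / 1e9:.2f} Gsamples/s")   # ~0.35 Gsamples/s
print(f"4x SS : {ssaa / 1e9:.2f} Gsamples/s")   # ~1.38 Gsamples/s

A multisampling or tile-based design changes that second number, of course; that's the whole point of the "almost free" claim.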

Indiana
23rd November 2001, 15:38
Besides, I'm not a number-crunching professional, but I could still use something faster than my current CPU (AthlonXP @ 1.65 GHz ~ XP2000+). If you do lots of MP3 encoding and video capturing, with editing and recompression (MPEG-2 or DivX) afterwards, you need one hell of a fast CPU and a hell of a large and fast HD. I wouldn't mind having an AthlonXP @ 10GHz combined with a 100GB+ RAM-based "harddrive"... :D

2D speed: I can see a difference between some of today's cards (no, not only with XP's fancy [crazy?!?] effects) and I think it's quite a pity that 2D speed has become so forgotten, since some cards (Kyro, G450) are dogs at high res and with large windows, especially when using overlays.

Guru
23rd November 2001, 16:39
2D graphics haven't reached the top just yet either, just look how the various effects in WinXP slow all the cards out today...

Like I have asked before, why use an OS that looks like a fvcking Picasso?

leech
23rd November 2001, 20:54
Get it straight, WindowsXP looks like a Fisher-Price toy. At least the default theme does. I use the Aqua theme; go to http://www.themexp.org for all that fun stuff. I can't stand the default theme. It looks, like I said, like a Fisher-Price toy.

Leech

mdhome
23rd November 2001, 22:36
And 16x FSAA doesn't need 16x speed; the newest rumors say that on a tiler, FSAA can be done (almost) for free.

TDB

Show me a card that can do 4x FSAA for free, because I don't think any of the current tilers can.


regards MD

Novdid
23rd November 2001, 23:48
TDB, you've got me wrong here; I was saying IF our future monitors can't run resolutions above 2048. Which they will...

16x will never come for free, but it may not need 16x the fillrate, though. I do get your point on the pixels vs polygons thing, but as I said, you can do almost anything in 3D graphics. I can mention one thing: enormous textures. That alone would slow down everything tremendously while providing a significant boost to image quality, and here memory bandwidth will be important.

Per-pixel real-time lighting would be cool; fog and explosion effects done in awesome detail... I could go on forever.

We will have faster graphics whether we like it or not.

JF_Aidan_Pryde
24th November 2001, 00:00
MD,

The Gigapixel chip could do FSAA for free using multisampling, as it generated free sub-samples every clock cycle. It was demoed behind closed doors against a G400Max; both cards had roughly the same FPS on the counter, but the GP card had 4x MSAA enabled. :)

JF_Aidan_Pryde
24th November 2001, 00:04
Having any resolution above 2048 is rather pointless for the common consumer. The visible difference between 640x480 [TV, which looks rather *photorealistic*] and 1024x768 is HUGE, but how many people can honestly appreciate an increase from 2048 to 4096?

Our *eyes* are the limiting factor now. :)

What we need is 3D holographic displays... say bye-bye to texture filtering.

Novdid
24th November 2001, 00:12
The screens will get bigger, and then the need for higher resolutions will become apparent.

Example: 800x600 looks better on a 17" than on a 21" monitor.

superfly
24th November 2001, 01:23
I honestly do believe that within 4~5 years we'll reach the point where cards will be capable of rendering photorealistic images in real time.

I mean, think about it: 4 years ago we had the first consumer 3D cards on the market (Voodoo 1), and in such a short period of time we're all the way up to GF3s and Radeon 8500s, which are 25~30 times more powerful than the original Voodoo - and that's just in terms of fill rate - and, to top it off, have integrated 2D cores and hardware DVD decoding engines as well.


If the card makers continue at the very same pace, I can't even imagine how powerful cards will become even in 2 years' time, let alone 4.

For example, there was a tech conference in Taiwan some months ago at which the CEO of nVidia gave a broad general outline of where they intend to go in the next few years, and while he didn't discuss any specifics on upcoming products, he did state the following...

By the end of next year, it will be possible to build graphics chips with over 100 million transistors at over 300 MHz on 0.10 micron design rules.

But here is where it gets interesting...

By 2005, there will be chips on the market with over 300 million transistors running in excess of 750 MHz.

GF3s and the Radeon 8500 have 60 million transistors; can anyone here honestly imagine what 300 million could be used for?...

The word that comes to mind, at least for me, is EVERYTHING..... :D

TdB
24th November 2001, 09:48
The screens will get bigger, then the sudden need of higher resolutions become apparent.

Why would we want bigger monitors??? :confused:

A bigger monitor on a workstation would be impractical and useless; if the monitor is that big, then we would need a separate table for it, because otherwise you would be too close to it.

I can see the point with a big TV, because you don't sit right next to it.

Multiple monitors, however, are a good idea.

orangejulius
24th November 2001, 12:16
"The <a href="http://www.pc.ibm.com/us/accessories/monitors/index.html">IBM T221</a> monitor delivers outstanding visual performance. Packing over 9.2 million pixels into its 22.2in/564mm viewing image area, the T221 monitor displays exceptionally high-quality images for users in the medical, scientific and many other fields, anywhere the clarity of images is critical."

That's why. And it ships with a Matrox G200 MMS.

Hati
25th November 2001, 04:07
From rivastation:


(Translated from German:) "However, there are also hints of something from other manufacturers: a Matrox project named Parhelia - with 4 vertex and 4 pixel shader pipes and 19.2GB/s of memory bandwidth from 256-bit DDR memory (if it's true...)."

But, the headline was 'Gerüchteküche' (rumor mill)

Hati

KngtRider
25th November 2001, 10:50
Hi all,

Sorry to disappoint, but the ATI 8500 is the reference platform and target platform for DX9.

www.reactorcritical.com

My opinion:

it looks like either
a) MS didn't like the new MGA chip / or it's crippled
b) the timeframe wouldn't fit DX9 development

Those rivastation specs, LOL

256-bit DDR, wooo

19.2GB/s means 300MHz 256-bit DDR; I calced it just now.
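
For anyone who wants to check the arithmetic, here's the standard peak-bandwidth calculation (just a sketch of the usual rule of thumb, not anything official):

# Peak memory bandwidth = bus width in bytes x memory clock x 2 (DDR transfers twice per clock)
bus_width_bits = 256
mem_clock_hz = 300e6   # 300 MHz
ddr_factor = 2         # DDR: two transfers per clock

bandwidth_bytes_per_s = (bus_width_bits / 8) * mem_clock_hz * ddr_factor
print(f"{bandwidth_bytes_per_s / 1e9:.1f} GB/s")   # prints 19.2 GB/s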

Jorden
25th November 2001, 14:34
Good chance Matrox is trying to get into this list (http://www.stereographics.com/html/hardware.html) with their new upcoming card.

Then again, it would mean we all have to buy this monitor (http://www.stereographics.com/synthagram/synthagram.htm)... now who has US$6000 for me to spare? ;)

Jord.

Hati
25th November 2001, 23:26
I wonder if Matrox will have some kind of 'XBA' design. Some comment from Nappe1 back in October said:


What I heard about the Matrox chip, it seems that it cannot be accomplished without eDRAM or a VERY efficient tiler.

Hope it's not just rumors that end up with M announcing a G575

:D

Hati

[GDI]Raptor
27th November 2001, 07:17
This looks very good. If this is true, Matrox might have a good card to release, but the question is when. "Sometime in 2002", like Haig says, is a very "wide" release date.......

We saw rumors about the G800 some years ago; back then they talked about FCDDR RAM - what specifications does that memory type have? 256-bit DDR: to get 19.2GB/s you would need a 300MHz memory clock, I believe....

I think the chip name Matrox Parhelia sounds good though.. :cool:

atko
27th November 2001, 10:52
Originally posted by KngtRider
sorry to disapoint but the ATI 8500 is the reference platform and target platform for dx9

looks like either
a) ms didnt like the new mga chip/or crippled
b) timeframe wouldnt meet development for dx9
It is very interesting, because the Radeon 8500 is a DirectX 8.1 card. I don't know what "reference platform" really means, but I don't think that card will be the first DX9-compliant card.

rev
27th November 2001, 16:07
Originally posted by atko

It is very interesting, because Radeon 8500 is a DirectX 8.1 card. I don't know what "reference platform" really means, but I think it isn't that card which will be the first DX9 compliant card.

Hi, my fellow friend, let me help you out with some facts about DX9.

Some days ago I checked out the Meltdown 2001 PowerPoint presentations about future DX9 features. First, I saw dates like July 2002 for the DX9 release, but that seems too far off to me; I wouldn't agree with it. Okay, let's see what we've got.

We've got a whole bunch of VS/PS enhancements like flow control, and integrated VS and PS (like DX Graphics merging D3D+DDraw in DX8) with the same instruction set. More Matrox-specific is displacement mapping, which will be supported 100% by DX9 and is a feature not announced by any other company. I think it can be substituted with VS code, but supporting it with a separate part of the hardware with HUGE memory bandwidth (and its supporting architecture), AND giving us developers an integrated VS/PS 2.0 (3.0? whatever) with 256 or more instructions, flow control, etc., would be heaven of heavens :)

(Just got to the MS site; here's a link for the Meltdown PPT files:)
http://www.microsoft.com/mscorp/corpevents/meltdown2001/presentations.asp

What I am talking about is:
DirectX® Graphics Future -- DXG9.ppt (439KB)
Speaker: Brian Marshall - Microsoft Corporation

Of course you should check this out too:
Displacement Maps As A New Geometric Primitive -- Matrox_Displacement.ppt (1130KB)
Speaker: Juan Guardado - Matrox Corporation
(It might already have been mentioned in a previous post; sorry if so.)

Okay, DX9 has HOS (high-order surfaces) as a main feature, but half of the presentation is about DM. I think it's quite informative about what Microsoft thinks of future 3D technology; Matrox's presentation title says:

DMs as a new geometric primitive

Quite interesting, huh? :) I checked out the Matrox presentation too; here are my comments on it:

Juan's presentation has a slide containing 3 maps and a plane with 8 faces. And then a picture of the result, which looks like 50000+ faces, has bump mapping, etc. "Ingredients": height data, a normal map, a base mesh (really loooow poly) and some tessellation info. It says this feature needs really high bandwidth, which is quite logical if you think about generating most of the vertices on the fly rather than just transferring them from some kind of buffer over AGP. It has some distance-based LOD too, quite interesting, and has N-Patch interpolation, which looks like ATI's TRUFORM to me - correct me if I'm wrong (that's for rounded objects; linear is for terrains, etc., as Juan stated). LOD works on N-Patches too, everything is dynamic, you can control the tessellation level with the base mesh (always a plane, or what?) by putting more vertices at parts of the mesh (plane?), the VS gets the DM info, it says something about a high compression rate, 44:1 (unsure about that), and they showed a demo of it, and they might have a MAX 4 plugin to preview all this stuff, which would suggest to me that they might have an engineering board already...

Okay, that's all for today; have a nice time replying to me :)
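
To make those "ingredients" a bit more concrete, here's a rough CPU-side sketch of the core idea as I understand it (my own simplification in Python, not code from the Matrox presentation): tessellate a low-poly base patch, then push each generated vertex out along the normal by the value sampled from the height map.

import numpy as np

def displace_patch(corners, normal, height_map, scale, tess=16):
    """corners: 4x3 array with the quad corners of the base patch,
    normal: unit normal of the patch, height_map: 2D array sampled over it,
    scale: displacement strength, tess: subdivisions per edge."""
    verts = []
    h, w = height_map.shape
    for i in range(tess + 1):
        for j in range(tess + 1):
            u, v = i / tess, j / tess
            # bilinear interpolation across the 4 corners of the base patch
            p = ((1 - u) * (1 - v) * corners[0] + u * (1 - v) * corners[1]
                 + u * v * corners[2] + (1 - u) * v * corners[3])
            # sample the height map (nearest sample, for simplicity)
            d = height_map[int(v * (h - 1)), int(u * (w - 1))]
            verts.append(p + normal * d * scale)  # displace along the normal
    return np.array(verts)

# A flat quad tessellated 16x16 and displaced by a random "terrain" height map:
quad = np.array([[0, 0, 0], [2, 0, 0], [2, 2, 0], [0, 2, 0]], dtype=float)
up = np.array([0.0, 0.0, 1.0])
terrain = np.random.rand(64, 64)
mesh = displace_patch(quad, up, terrain, scale=0.5)
print(mesh.shape)  # (289, 3): 289 displaced vertices generated from a 4-corner patch

The real thing would of course happen on the chip every frame, with the LOD picking the tessellation level, which is exactly why the presentation keeps going on about bandwidth.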

rev
27th November 2001, 16:10
Correction: integrated VS and PS will arrive around DX10, sorry about that, but DM is really close to being revealed!

mdhome
28th November 2001, 00:49
Hello Rev
Firstly, Welcome to the site.
Secondly, thanks for the info. Unfortunately it might take some time sinking into my head.:)

This is the sort of detail I go to Beyond3D for. I don't promise to understand it, but I am glad that someone else does :)

Regards MD

PS: don't suppose you know any good, simple 3D guides on the internet? :)

atko
28th November 2001, 12:56
Welcome to the forums of MURC, rev! :)

I've already read those presentations you're talking about, and I've discussed them with my friends (who know much more about 3D programming than me) here at the university, and I know that displacement mapping is a must-have feature in the next generation of 3D accelerators. I hope that Matrox will be the first video chip manufacturer to support this feature, but unfortunately Matrox nowadays is not that kind...
Displacement mapping has huge advantages, but we need a very powerful chip to tessellate tens of thousands of vertices, and this chip needs integrated memory (eDRAM?) to store the vertices after transformation... It would be very nice to see that Matrox has such hardware, but be realistic...

rev
28th November 2001, 15:14
Originally posted by atko
Welcome to the forums of MURC, rev! :)

I've already read those presentations you're talking about, and I've discussed them with my friends (who know much more about 3D programming than me) here at the university, and I know that displacement mapping is a must-have feature in the next generation of 3D accelerators. I hope that Matrox will be the first video chip manufacturer to support this feature, but unfortunately Matrox nowadays is not that kind...
Displacement mapping has huge advantages, but we need a very powerful chip to tessellate tens of thousands of vertices, and this chip needs integrated memory (eDRAM?) to store the vertices after transformation... It would be very nice to see that Matrox has such hardware, but be realistic...

Okay, let's see, Matrox talking about DM... Why don't you think these 4 years of silence were long enough to develop something that kicks ass? Matrox always had the money for R&D from the OEM side, and they might have had to slow down because of the modest success of the G550, or whatever reasons they had, but they must be on the right track to get back into the competition alongside the release of DX9... There's too much fuss going on around big M, and they are too silent... Soon everything is going to be about DM and DX9, and DM+DX9 will be synonymous with Matrox technology. Hopefully. :p

Anyway, I also think DM rocks and it has to be a next-gen card feature somehow. I think it could be implemented with VS code, but that's going to be too slow for a while...

mute
28th November 2001, 18:00
Well, I think all that time was spent increasing the available bandwidth; in the presentation they state that it takes a lot of bandwidth. With recent rumours stating the new card will have 19.3 GB/s, it may very well be true...
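For what it's worth, the rumoured figure drops straight out of the usual peak-bandwidth formula (bus width in bytes times effective transfer rate). A quick sketch; the 256-bit / 300 MHz DDR pairing is purely my assumption of one way to reach that number, and the GeForce3 line is just there for scale.

#include <stdio.h>

/* Peak memory bandwidth in GB/s: (bus width in bits / 8) * effective clock.
 * For DDR the effective clock is twice the actual clock. */
static double peak_gbs(int bus_bits, double clock_mhz, int ddr)
{
    double effective = ddr ? clock_mhz * 2.0 : clock_mhz;
    return (bus_bits / 8.0) * effective * 1e6 / 1e9;
}

int main(void)
{
    /* One (assumed) way to hit the rumoured ~19.2 GB/s figure. */
    printf("256-bit @ 300 MHz DDR: %.1f GB/s\n", peak_gbs(256, 300.0, 1));
    /* For comparison, a GeForce3-class setup. */
    printf("128-bit @ 230 MHz DDR: %.1f GB/s\n", peak_gbs(128, 230.0, 1));
    return 0;
}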

JF_Aidan_Pryde
28th November 2001, 18:27
4 years is an all-or-nothing number in the gaming/hardware industry.

MechWarrior 2, Falcon 4 and Max Payne turned out well;

3dfx's Rampage and the Bitboys' chips didn't.


Oops, I'm comparing games to hardware..rant on. :D

[GDI]Raptor
29th November 2001, 00:09
If the rumours about 19.2 GB/s of memory bandwidth are true, then Matrox probably has hardware that needs that memory speed. I am pretty sure Matrox will release a new chip in 2002, but the question is still: when?

I have heard on MSDN that DirectX 9 will be released this spring, so if Matrox has a DirectX 9 card, spring will be a good time to start shipping it. Wasn't it "spring 2002" when the "drunken" Matrox rep at the AMD conference said that Matrox would be "king of 3D"..?

KngtRider
29th November 2001, 00:33
Hi all

Read the wording of Haig's post very carefully

http://forum.matrox.com/mgaforum/Forum8/HTML/000703.html

Also, he is reading this thread ;)

[GDI]Raptor
29th November 2001, 00:42
Yes, but the most important thing is that he confirms that a new card is on the way. The G550 wasn't optimised for OpenGL, and since the "new card" will be, it's a sign that it will be a fast card.

SteveC
29th November 2001, 03:12
Originally posted by [GDI]Raptor
Yes, but the most important thing is that he confirms that a new card is on the way. The G550 wasn't optimised for OpenGL, and since the "new card" will be, it's a sign that it will be a fast card.

Was the G550 optimised for *anything*? ;)

Tempest
29th November 2001, 03:20
Originally posted by KngtRider
Also, he is reading this thread

http://koti.mbnet.fi/~sampokoo/misc/hihaig.gif

Okay, so things seem to be quite OK for now. A new card is apparently in the works, and if everything that has been said in this thread were false and the new card were a G650 with HandCasting™ (for deaf people), Haig wouldn't have posted a link to this thread... I think?

KngtRider
29th November 2001, 10:19
Ahhh

I get you now, Tempest :)

Now that I think of it you're right: either he is keeping tabs on what rumours are going around (doesn't everyone) or posting the link was a subtle hint.

EDIT: lol at the logo, make one for Carmack and Billy G too :)

How did you do that anyway, manually in a paint program? I want one for another forum.

Haig
29th November 2001, 11:55
A new card is apparently in the works, and if everything that has been said in this thread would be false.......Haig wouldn't have posted a link to this thread... I think?

There's also an interesting thread going on here:

http://forum.matrox.com/mgaforum/Forum8/HTML/000548-3.html

:D :D :D

Haig

Haig
29th November 2001, 12:19
Seems valid to me:)

Ali
29th November 2001, 14:42
Ok, Haig says we can speculate about: memory type, amount of memory, price, amount of bandwidth, amount of outputs, name, extra goodies, etc.

I would say DDR memory, because it's cheap.

Probably 256-bit, with a slightly slower speed (200 MHz-ish, rather than the fastest available).

Amount of RAM: 64 or 128 MB. Don't see the point of less than 64 MB in a new card; any more than 128 MB would be extra cost for very little gain.

Amount of bandwidth: lots and lots. Probably a biggish chunk (8-16 MB) of embedded memory to act as a cache for the external memory.

Amount of outputs: 1 VGA, 1 DVI, 1 USB2 and 1 FireWire, with an adapter to convert the DVI to VGA of course. Possibility of a single connector on the card, then a great big breakout cable with all the connections you could think of.

Name: something to do with Fusion, or Piariah (sp?).

Extra goodies: well, DM is a given I would say. Probably a fast T&L engine (I heard 80 million triangles/sec somewhere). HeadCasting will be there, since they spent all the money on it. There is sure to be something interesting that nobody else has thought of. Matrox always does that.

Ali

DukeP
29th November 2001, 14:55
<Rant>
One guy posted in the Matrox fora that high RAMDAC speed is about to become obsolete with the DVI standard.
He might be right, in time. As of now, the DVI standard is not up to scratch: far too limited in bandwidth. I don't know what they were thinking when they designed those specs (for DVI). Dell have just begun selling low-price(ish) 20" TFT monitors. Not sure if they can run at their highest resolution via their DVI port; if they can, that must be at least very near the top of the spec.

</Rant>

Plz keep 2D quality at the foremost!

~~DukeP~~

Novdid
29th November 2001, 15:09
256-bit memory is not going to be cheap.

In the long run I think they would rather go with 400 MHz 128-bit DDR (maybe not possible today, but maybe in a couple of months) than with 200 MHz 256-bit DDR.
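Putting rough numbers on that trade-off (same naive peak-bandwidth arithmetic as before, treating both figures as the real, pre-DDR clock): the two options land at the same raw bandwidth, so the choice really is narrow-and-fast memory chips versus a wide, expensive PCB and package.

#include <stdio.h>

int main(void)
{
    /* peak bandwidth = bus width (bytes) * effective transfer rate */
    double narrow_fast = (128 / 8.0) * (400e6 * 2) / 1e9;  /* 128-bit, 400 MHz DDR */
    double wide_slow   = (256 / 8.0) * (200e6 * 2) / 1e9;  /* 256-bit, 200 MHz DDR */

    printf("128-bit @ 400 MHz DDR: %.1f GB/s\n", narrow_fast);  /* 12.8 GB/s */
    printf("256-bit @ 200 MHz DDR: %.1f GB/s\n", wide_slow);    /* 12.8 GB/s */
    return 0;
}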

Tempest
29th November 2001, 15:50
Originally posted by Ali
Name: something to do with Fusion, or Piariah (sp?).
Piariah=Parhelia? I don't think that "Matrox Pariah" would be a very good-selling name... (Just kidding :))
I would like the final product to have something to do with the sun. If the technology is codenamed Parhelia (sundog), the final product might as well be named Ra (Egyptian god of the sun) as faked by me (http://koti.mbnet.fi/~sampokoo/misc/mgara.gif) a while back...
The righteous name would of course be Matrox MGA Millennium G800 Parhelia Phoenix MAX :D
But I don't think the name will be G-anything because it would lead one to believe that the card is just another product based on the old core (which I hope it's not).

Ali
29th November 2001, 16:56
256bit memory has nothing to do with the actual chips, just the bus going to them (as far as I know).

The only thing that would cost more is the PCB itself.

Think of the nForce. It only uses normal DDR RAM, which is usually 64-bit wide, but you get a 128-bit bus if you use two or more sticks of RAM. Therefore your 'old' 64-bit RAM runs at 128 bits.

And yes, Piariah=Parhelia, I was just too lazy to look back to see how to spell it :)

I also forgot about price. Really, I don't care. The G400 lasted so long that if the card is advanced enough to last 2 years, it could cost the same as 4 nVidia cards and come out even ;)

Being serious, I would expect it to be expensive, probably around US$400 and upwards.

Ali

superfly
29th November 2001, 20:21
It's not just extra PCB space, it's mostly adding all those extra signal traces between the chip and the memory itself and making sure that all those extra lines are properly routed to allow for the high clock speeds of today's DDR memory, currently at 332 MHz (3.2 ns)...

Just on the graphics processor itself, it would add quite a large number of extra pins to the packaging...

Let's take an example like the GF3: even though it still uses a 128-bit bus between the graphics chip and memory, it has about 150 extra pins over the GF2, which also uses a 128-bit bus and the same type of memory clocked at the same speed.

Those 150 pins (on a package totalling about 700 pins) are there because the GF3 splits that 128-bit bus into four separate 32-bit buses, one for each pipeline (crossbar tech).


Making that bus into a full-blown 256-bit one would add even more pins than in the above example, and there would also be memory granularity issues, since the memory module packaging would have to be larger to accommodate the extra pins as well.

So there are a lot of issues to consider when moving to a wider bus, otherwise we would have seen the transition happen by now.

Novdid
30th November 2001, 01:31
Right on!!!

Lemmin
1st December 2001, 08:07
I'm not sure about the embedded RAM... you need a lot for it to be worthwhile, and even if you put enough in for today's games, if someone releases a game in the future that is texture-heavy, you lose most of the benefit. Also, embedded RAM makes the chip die much bigger, hence you get fewer per wafer and cost goes up. Better to concentrate on fast bandwidth to the main memory IMHO.

How about some different (non-brute-force) rendering techniques? Tile-based and deferred rendering like the PowerVR chips? Or is that technology patented?

Also, just had a mad idea which is kind of off-topic, but would it be possible to have an asynchronous (non-clocked) graphics chip? Obviously it would have to interface to clocked memory... dunno if there would be any benefit to going asynch for a graphics solution.

LEM

Novdid
1st December 2001, 09:33
Some embedded RAM never hurts, even if it's a very small amount. It would work like the cache on a CPU, but in this case to store triangle data, for example.

Ali
1st December 2001, 14:20
Superfly's post is quite good; it explains a lot about why 256-bit buses aren't already in use. I didn't realise about the extra pins needed for the crossbar controller. I thought they were there in case ATI beat Nvidia, so Nvidia could release a 256-bit version of the GeForce3 to get back on top (that's what I thought the Ti series was going to be).

Anyway, if Matrox adds some embedded RAM, that would make the chip bigger, so there wouldn't be that much of a problem fitting more pins on the bottom for the 256-bit bus.

It would cost more of course, but most of us Matrox users tend to have nice big monitors that cost a lot anyway, and don't mind spending more money on a nice graphics card (well, that's my feeling anyway).

If the next Matrox card is good enough, I might consider getting rid of my old 17-inch secondary screen, moving my 21-inch over to secondary, and getting one of those 20-inch Dell flat panels for my primary. I don't play games as much as I used to, so any slight delay problems in the flat panel wouldn't matter.

Ali

superfly
1st December 2001, 16:17
Embedded RAM could be made very fast, since the memory could run as fast as the graphics processor itself and the bus could be made as wide as needed; 256-bit or even 512-bit wouldn't be out of the question since it's all integrated on the same die.


But there are disadvantages as well. For one, there's the extra die space it would take to put a meaningful amount of it in, even if memory itself can be packed much "tighter" than logic circuits. Most high-performance cards out there can already run certain games at 1600*1200 32-bit at pretty decent frame rates, which by itself takes up 30 megs of frame buffer (1600*1200, 32-bit, triple buffered).

The only way out of that situation is if the chip maker somehow managed to implement frame buffer, texture and vertex compression routines so that the maximum amount of data could be stored in the embedded memory, and had extremely efficient prefetch circuitry to minimize the performance hit when the chip has to get its data from the card's main memory.


Texture and vertex compression routines already exist in the DX8 spec, but I'm not sure about frame buffer compression routines, so those would most likely have to be implemented in hardware in such a way that the card does it on its own, without intervention from game developers, and is API-independent (either OpenGL or Direct3D would work).


Then there's the fact that by having that extra memory, the chip's feature set or its implementation is sacrificed in order to keep the chip at a reasonable size and cost. For instance, any given chip maker can either have a vertex shader with, say, 5 million transistors if some of the die space is already used for embedded memory, or a vertex shader that could potentially have 10 million if there weren't any embedded memory on the chip. So it's a difficult decision: you could either use the "weaker" vertex unit with embedded memory and potentially use all of its performance, or have a stronger one without embedded memory and rely on other optimizations to achieve the best possible performance from it. It's a tough call.


But as it is, DX9 already brings improvements over DX8 in terms of vertex and pixel shader flexibility, and adding displacement mapping will likely cost extra transistors as well, so making a full-blown DX9 card and adding a meaningful amount of embedded memory to it would likely be a little too much to ask. IMHO.


I do believe that at some point at least some chip makers will move the main bus to 192 bits, which wouldn't be nearly as bad as going to the trouble of a full 256-bit one, and would provide quite a nice boost in bandwidth as well, since it would provide about 15 GB/sec. Use that with a system similar to what Nvidia has done with their GF3, but implement it with six 32-bit memory controllers, one for each rendering pipeline (which would require that 192-bit bus in the first place), and those 15 GB/sec would look like 20+ in terms of real-world performance.

One can only dream....:D.
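Two of the numbers in that post are easy to sanity-check. The ~30 MB frame-buffer figure works out if you assume a 32-bit Z/stencil buffer alongside the three colour buffers (my assumption, the post doesn't say), and the ~15 GB/s figure for a 192-bit bus falls out of the usual formula at roughly today's ~330 MHz DDR memory clocks.

#include <stdio.h>

int main(void)
{
    const double MB = 1024.0 * 1024.0;

    /* 1600x1200, 32 bpp, triple buffered, plus an assumed 32-bit Z/stencil buffer */
    double colour = 1600.0 * 1200.0 * 4 * 3;
    double zbuf   = 1600.0 * 1200.0 * 4;
    printf("frame buffer: %.1f MB\n", (colour + zbuf) / MB);   /* ~29.3 MB */

    /* 192-bit bus at ~330 MHz DDR (660 MT/s) */
    double bw = (192 / 8.0) * 660e6 / 1e9;
    printf("192-bit @ 330 MHz DDR: %.1f GB/s\n", bw);          /* ~15.8 GB/s */
    return 0;
}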

2Whyzzi
1st December 2001, 18:44
and have Synchronous RDRAM or something similar with a highly optimized/pipelined core.

mute
1st December 2001, 22:32
RDRAM now? That would make the price go up...

believe me

Novdid
2nd December 2001, 03:54
The high latency of RDRAM should make DDR a much better choice.

Maggi
2nd December 2001, 04:25
Wasn't there mention of FCRAM in one of the older rumours?

Just for comparison's sake, I'd love to see that one used ... :D

Maggi
2nd December 2001, 04:37
DRAM Type                          | PC100  | PC133  | DDR    | RDRAM  | FCRAM
-----------------------------------+--------+--------+--------+--------+-------
Clock speed (MHz)                  | 100    | 133    | 133    | 400    | 133
Data rate (MHz)                    | 100    | 133    | 266    | 800    | 266
System data bus width              | 64-bit | 64-bit | 64-bit | 16-bit | 64-bit
Peak Bandwidth (MB/sec)            | 800    | 1067   | 2133   | 1600   | 2133
Bus Utilization                    | 62%    | 59%    | 42%    | 74%    | 55%
Max. Effective Bandwidth (MB/sec)  | 494    | 631    | 897    | 1190   | 1165

found here (http://www.toshiba.com/taec/components/Generic/WP_memory.shtml)

:D
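The "Peak Bandwidth" row in that table is just the data rate times the bus width, which is easy to verify (the small differences come from the table using 133.33/266.67 MHz rather than the rounded figures):

#include <stdio.h>

int main(void)
{
    struct { const char *name; double data_rate_mhz; int bus_bits; } m[] = {
        { "PC100", 100.0,      64 },
        { "PC133", 400.0 / 3,  64 },   /* 133.33 MHz      */
        { "DDR",   800.0 / 3,  64 },   /* 266.67 MT/s     */
        { "RDRAM", 800.0,      16 },
        { "FCRAM", 800.0 / 3,  64 },
    };

    for (int i = 0; i < 5; i++) {
        double mb_s = m[i].data_rate_mhz * (m[i].bus_bits / 8.0);
        printf("%-6s peak: %4.0f MB/s\n", m[i].name, mb_s);
    }
    return 0;
}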

mdhome
2nd December 2001, 05:00
Maggi, silly question, but what is the difference between the two?

regards md

KeiFront
2nd December 2001, 08:03
FCRAM has low latency and high bandwidth.
RDRAM has high latency and high bandwidth.

FCRAM is ideal for graphics use, with much lower latency than RDRAM.

superfly
2nd December 2001, 08:13
RDRAM would be inadvisable since, like others have mentioned, it still has the high-latency disadvantage in its current implementation. However, since the RDRAM spec will undergo changes in the medium term (32-bit and even 64-bit versions are planned, up from the current 16 bits, and even higher clock speeds are in the works), it could very well be used in graphics cards within the next 2 years.

A 2-channel, 64-bit PC1200 (600 MHz effective) RDRAM solution would provide about 20 GB/sec of bandwidth.

Novdid
2nd December 2001, 08:16
Very interesting table... FCRAM seems like the best choice, no doubt.

Indiana
2nd December 2001, 09:02
Besides, Rambus is about to lose the DDR SDRAM court case against Infineon as well, and most probably will finally be dead for good.
Without Intel's support I seriously doubt that there will still be any RDRAM on the market in two years...

Maggi
2nd December 2001, 11:39
Originally posted by mdhome
Maggi, silly question but what is the difference between the two.

regards md

not silly, but lazy ... :p

if you'd followed the link I posted above, you'd have found:


FCRAM - A Faster Memory Core
All of the DRAM types commonly discussed in the industry today, such as EDO, SDRAM, DDR and RDRAM, have one major thing in common: their memory cores are the same. What is different about each type is the peripheral logic circuitry, not the memory cell array. What this complex new peripheral logic circuitry does is attempt to hide the inherently slow memory core.

FCRAM is a novel concept, which finally recognizes and fixes the slow memory core by segmenting it into smaller arrays such that the data can be accessed much faster and latency greatly improved. How this is done is beyond the scope of this paper. If the reader is interested, both Toshiba and Fujitsu can provide more detailed information on FCRAM functionality.

The key measure of how FCRAM improves latency and can improve system performance is the read/write cycle time (tRC), which measures how long the DRAM takes to complete a read or write cycle before it can start another one. In the case of conventional DRAM types, including SDRAM, DDR and RDRAM, tRC is typically on the order of 70ns. With FCRAM, tRC of 20 or 30ns is possible. For this reason, this new device is referred to as a fast cycle RAM.

Besides faster tRC, FCRAM also improves latency with several new features that will be discussed below.

like the others already said, it has improved latencies

superfly
2nd December 2001, 11:45
I doubt that Rambus will go down in 2 years' time since, if nothing else, the PS2 uses Rambus memory and that alone will keep the company alive, considering the number of PS2 consoles already sold and the fact that the PS2's replacement probably won't be released for another 2-3 years at least.

KngtRider
2nd December 2001, 14:00
Creative Labs used 4 and 8 MB of Rambus memory on some of their 'early' Cirrus Logic cards (3D as in crappy ViRGE-level 3D, 1997).

They flopped.

You can look at the info in the tech info section on the Creative Asia site.

SmellYfeeT
2nd December 2001, 15:29
Very INTERESTING discussion going on; seems like the thread is becoming a hardware discussion forum :)
SuperFly brought up many good points on why going with 256-bit is technically challenging, and that is not the case only for graphics cards. There are many video processing chips that have the same issue. The BIG part of the problem is not having too many pins, but in fact lies under the packaging of the silicon, where you have all these thin gold wires connecting the die to the package; when you crank up the frequency on so many wires so close together you get severe transmission-line problems. One of the solutions to this problem is flip-chip technology, which connects the die pads more or less directly to the package and indirectly to the PCB, which of course is more expensive. So it is not impossible to do this, but rather expensive. I don't know this as a fact, but the Parhelia design has a 256-bit memory bus; however, the way it works is different and MORE efficient than Nvidia's crossbar technique.

SmellYfeeT
2nd December 2001, 15:44
By the way, regarding the discussion on embedded DRAM: I remember 2-3 years ago back in university, I heard from a classmate of mine that either ATI or Matrox (can't remember exactly) was working on a design with a memory manufacturer at the time to include parts of the pixel operations on the RAM (this is the other way around). I guess that idea got canned somehow, because I don't know of any cards built with that technology.
Another issue with embedded RAM is the LOW YIELD rate of the silicon, which does not make it an attractive option for graphics chips. Also, there are technical reports showing that a large amount of cache will not improve the performance of hardware components that perform mathematically intensive operations if they have low memory speed. So my guess is: expect a very high-performance memory system with normal on-board memory. Should we start the guessing game on memory frequency? My guess > 350 MHz ;) no really, it will be very high!

PS. I am very disappointed by now that Matrox has not announced anything! Therefore, we definitely won't see the card by March 2002! I will be buying an 8500 All-in-Wonder before Christmas now that it is out :( (go ahead, boo me...)

mute
2nd December 2001, 16:09
Their new card will be out right after DirectX 9 goes public...

and that's 100% SURE... if they don't can it...

Unam
2nd December 2001, 16:11
Might be missing something obvious, but what would prevent Rambus from widening the data path and cranking up the clock speed of RDRAM? I've read a few online articles that state they are going to be doing just that for some of Intel's future chipsets. Surely that would be cheaper than FCRAM...

superfly
2nd December 2001, 16:44
That's exactly what Rambus will be doing within the next year. PC1066 (533 MHz) will be available when Intel introduces P4s running with a 533 MHz FSB by mid-year, using an 850 chipset certified to run at that speed (133 MHz x 4).


The main question is whether PC1066 will still use a 16-bit bus or whether Rambus will transition right away to 32-bit, and if they do, whether Intel will still use 2 memory channels, which would total 8.4 GB/sec of bandwidth, or stay with only one at 4.2 GB/sec.


Just a question though: doesn't the GameCube also use Rambus? I seem to have read somewhere that it does.

KngtRider
2nd December 2001, 17:18
the next nvidia mobile GPU has embedded RAM; nvidia's fabs (TSMC+UMC) get/will get high yields out of those chips, and NEED to, as many OEMs (Dell, Toshiba etc.) use that GPU and ship many of those notebooks around the world

EDIT: the GameCube doesn't use Rambus, the N64 does

www.icrontic.com < GameCube

http://www.icrontic.com/gc5.htm < processors + RAM

the last time I saw MoSys RAM used was on the ET6000, and it needed special parts to suit that GPU

note the CPU and GPU have ©2000 on them even though the unit is a 2001 build, so there's about a 1-year lag between registering the intellectual property and shipping; you can apply that approximation to Matrox too, since it's a big new part

CHHAS
2nd December 2001, 23:53
I just hope that we'll see a good 3D performer from Matrox soon, this Gefart 2 Pro is killing me, maybe I should put the G400 back in and play in lower res, might even look better too :p

KeiFront
3rd December 2001, 01:32
Originally posted by Unam
Might be missing something obvious, but what would prevent Rambus from widening the data path and cranking up the clock speed of RDRAM? I've read a few online articles that state they are going to be doing just that for some of Intel's future chipsets. Surely that would be cheaper than FCRAM...

FCRAM should cost around $10 more than DDR SDRAM. That's still cheaper than RDRAM.

Does anybody know the ETA of DirectX 9?

CHHAS
3rd December 2001, 01:51
Heard some rumors about March.

tr4veler
3rd December 2001, 12:44
I thought it was the PS2 that used RDRAM?

Ali
3rd December 2001, 12:48
I still have that Carmack interview from about 18 months ago stuck in the back of my mind, saying Matrox had solved the bandwidth issue. That means they must have had a working, or near-working, simulation at least back then.

Also, Matrox has signed up with ATI on the shader specs for OpenGL, rather than using the nVidia-designed specs. That means a programmable T&L unit.

Amiga said Matrox's next product would be kick-ass, and I assume they didn't mean the G550.

Haig has been dropping hints about the memory used being interesting.

There was an interview a while ago with a Matrox guy saying they needed to solve the anti-aliasing problem. Lots of bandwidth should do that. Huge supersampling would be best, as textures wouldn't get washed out like on the GeForce3.

Displacement mapping is almost a given, and that needs huge amounts of bandwidth.

And lastly, Matrox always makes a balanced product. So the core will have enough pipes to feed all that bandwidth that we know they must have. Should be a VERY fast product.

Ali

[GDI]Raptor
3rd December 2001, 13:25
FCRAM looks like the best solution for Matrox. I really hope they have fixed all the problems. High-quality anti-aliasing would be very nice; with a bandwidth of 19.2 GB/s it looks really feasible. Displacement mapping is also an impressive feature, and I think Matrox will use it in their chip, because they have presented it at several trade shows.

DirectX 9 is due in March/April, that is what I can read on the MSDN homepages.

I hope Haig can give us a hint on when we can expect an announcement from Matrox; "sometime in 2002" is not a good answer... :cool:

superfly
3rd December 2001, 15:13
I think that ultimately it isn't wise for any given chip maker to implement different versions of pixel shaders, vertex shaders or what have you just so the company that developed them can boast about having a feature that other cards don't, since all that does is create confusion among users and delay support from game developers; it's been proven time and time again.


So the real issue should be how well each feature is implemented, in order to see who has the best image quality/speed compromise, not whether the feature is actually there or not.

I know that won't go down well with each company's marketing dept, but ultimately it's unavoidable if we want to see those features get used (and abused... :D) by game developers as soon as possible.


As far as the anti-aliasing issue is concerned, from the admittedly few shots I've seen so far, the Radeon 8500 does a pretty kick-ass job thanks to the variable algorithm they use, since it keeps textures from looking washed out like on the GF3. All it would need now is the anisotropic capabilities of the GF3 (at least 64-tap, preferably 128-tap) to make it look perfect.


Can't wait to see what ATI has up their sleeve with the R300 chip, which should be in the final stages of development; regardless, it clearly shows that ATI is on the right path.

KngtRider
3rd December 2001, 16:09
ATI has/will always have D3D and GL problems; since the R6C it's been their nature, inherent to the company. It would be surprising if they fixed EVERYTHING incl. trilinear.

100% chance displacement mapping is in the new chip, no doubt. They have engineers working on that, as shown by Meltdown.

http://www.nvidia.com/docs/lo/983/SUPP/detonatorxp.pdf

this document, released with the 23.10 Dets, shows the features of the Detonator XP drivers, including the GL caps supported by their ICD

the doc shows what Matrox needs to implement on the new part

I learned 2 new things from the nvidia PDF today:

1) Panel Link and TMDS are trademarks of Silicon Image, Inc.

2) no support whatsoever for the reference driver, and the board makers wouldn't be able to help with code optimisations because they just customise the drivers, they don't rewrite the core

now the 1000-dollar question is: will I have a new Matrox card in MY system in 2002?

good thing though is that no one in the AUS hardware review/overclocking community likes Matrox, so I've got a head start to keep track of the chip and try to get an AUS exclusive, heh heh

now the remaining issue, case badges

stuff the memory

we want matrox case badges again now

the MII had one

BTW, the N64 did use Rambus

superfly
3rd December 2001, 17:24
Well, let's give them credit that the hardware behind the 8500 is pretty impressive, especially at the price these cards are going for, and I'm pretty sure that ATI can't possibly be making a lot of money on them, especially when it's quite easy to find the cards going for well under $250, something users haven't seen in a couple of years, at least when it comes to high-end video cards.


And it can't be denied that they've already made quite a few improvements in the driver dept for a card that's only been available very recently, so I'm guessing that with a couple more driver releases most of the issues that still remain will be addressed, and with any luck we'll see the card operating at its full potential.


And get this: according to iXBT, nvidia is supposed to announce the NV25 by late January, and ATI will apparently spoil it somewhat by launching even higher-clocked versions of the Radeon 8500 (300 MHz core, 300 MHz memory).

But again, according to the site, they'll have two models available: the Radeon 8700, which is just the higher-clocked version mentioned above, and, get this, the Radeon 8800, which also runs at the same clocks as the 8700 but packs 128 megs of 300 MHz DDR SDRAM.

Damn, 128 megs... until recently most users didn't have that much running their entire system, let alone just for the video card... :)

KngtRider
3rd December 2001, 17:50
you serious !?!?

wow thanks for the info

nv25 has iDCT btw (but no deinterlacer/per pixel bob/weave)

The PIT
4th December 2001, 00:28
Soon graphics cards will have enough onboard RAM that AGP transfers will never be needed.

BryceMan
4th December 2001, 12:03
Soon graphics cards will have enough onboard RAM that AGP transfers will never be needed.
I'm sorry, but this statement is just wrong. There will always be increases in requirements. Textures get larger and more detailed, and every new technology uses some form of multi-texturing, it seems. You always have to get the data from somewhere, and that somewhere is through the AGP bus. I agree that bandwidth-optimizing technologies make the current trend of doubling the supposed AGP bus speed every year pretty ridiculous, but saying that it will never be needed is just as ridiculous.

SmellYfeeT
4th December 2001, 15:03
nv25 has iDCT btw (but no deinterlacer/per pixel bob/weave)

So will the Parhelia :)

superfly
4th December 2001, 15:39
Well, the use of even more onboard memory has less to do with textures, since pretty much every card out there supports texture compression routines in hardware, which can already save quite a lot in terms of memory footprint.


The texture compression routines built into DX can compress textures as high as 5 to 1. So even if we have a given card with 64 MB of memory in total, and half of that is already used for the frame buffer (and even then, only if the user decides to play his/her games at 1600*1200 32-bit), it still leaves 32 MB available for textures, which, if those compression routines are used, could make those 32 megs work like there were 150+ megs (5-to-1 compression).


Now imagine the situation where a card has 128 megs of RAM built in. I somehow have a hard time believing that particular card would ever be texture-limited, even if developers started to use texture resolutions as high as 2048*2048 or even higher, since with texture compression you could potentially have 250+ megs of textures and it would fit in the card's memory without having to texture out of the AGP bus.


The higher memory amounts would be more useful for things like higher-quality AA at very high resolutions, using 64-tap or even 128-tap anisotropic filtering (which would use way more frame buffer), or even using part of it as a vertex data cache for the graphics card, so that you don't overwhelm the AGP bus with polygon traffic...
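A quick back-of-the-envelope version of that budget (the 64 MB split and the 5:1 ratio are simply the post's own numbers, recomputed here):

#include <stdio.h>

int main(void)
{
    double card_mb      = 64.0;   /* total on-board memory          */
    double frame_buf_mb = 32.0;   /* the post's assumed split       */
    double compression  = 5.0;    /* DXTC-style ratio from the post */

    double texture_pool  = card_mb - frame_buf_mb;
    double effective_tex = texture_pool * compression;

    printf("raw texture memory:   %.0f MB\n", texture_pool);   /*  32 */
    printf("effective (5:1 DXTC): %.0f MB\n", effective_tex);  /* 160 */
    return 0;
}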

mute
4th December 2001, 16:37
and isn't s3tc 16 to 1 ?

not sure... guess so ..

superfly
4th December 2001, 17:02
Actually, I think it's up to 6-to-1 compression for S3TC, and I believe it was incorporated into the DX spec without any modifications made to it.

There is one caveat though: the spec supports up to those compression ratios, but there are a few intermediate ones between having no compression at all and the maximum. They are there to support specific effects, like translucency, for which the maximum compression ratio can't be used since it would result in serious visual glitches.


But even in a case where you can't compress all the textures using the highest ratio, a lot of memory can be saved even if developers were limited to the intermediate compression ratios, since it's not just the ratio itself that changes, it's the method used to achieve that ratio that's different too.
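For reference, the ratios people are quoting come straight from the block format: S3TC/DXT1 stores every 4x4 block of texels in 64 bits (two 16-bit colours plus sixteen 2-bit indices), i.e. 4 bits per texel, which gives 4:1 from a 16-bit source, 6:1 from 24-bit and 8:1 from 32-bit. The other DXT variants spend an extra 64 bits per block on alpha, halving those ratios. A tiny sketch of the arithmetic:

#include <stdio.h>

int main(void)
{
    const int texels_per_block = 4 * 4;
    const int dxt1_block_bits  = 64;    /* 2 x 16-bit colours + 16 x 2-bit indices */
    const int dxt5_block_bits  = 128;   /* DXT1-style colour data + 64 bits alpha  */

    int source_bpp[] = { 16, 24, 32 };
    for (int i = 0; i < 3; i++) {
        double uncompressed = (double)texels_per_block * source_bpp[i];
        printf("%2d-bit source: DXT1 %.0f:1, DXT5 %.0f:1\n",
               source_bpp[i],
               uncompressed / dxt1_block_bits,
               uncompressed / dxt5_block_bits);
    }
    return 0;
}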

KngtRider
4th December 2001, 22:37
ok, it will have X and X, but because of matrox's driver structure (no 3D adjustments, which is what MS likes) it probably won't allow adjustment of S3TC/DXTC

iDCT, yummy, but it won't have per-pixel bob/weave and a deinterlacer like the Rage Theater chip

surprise me

IcedEarth
5th December 2001, 03:30
OK, texture compression is nice and all, but do you realize that a compression ratio of 6:1 will result in lossy compression and thus lousy image quality? The highest obtainable lossless compression ratio is about 2:1, and that's with 8-bit images.

Lossless compression of 32-bit textures achieves very little, so I don't see the point of compressing textures unless you're willing to give up image quality. But since we all have Matrox cards, I suppose we all DO care about image quality, so texture compression isn't really that important IMHO.

Snake-Eyes
5th December 2001, 08:32
Well, let's examine the lossless/lossy argument for a second. If we really HAVE to lose quality to get decent compression, why is it that I can take a TIFF file (lossless image format) from my scanner, ZIP it using WinZip (compression tool), and the file size is greatly reduced (this is a 32bpp scanned image)? Keep in mind that the decompression of the contents of the ZIP file results in an exact copy of the original picture. (Of course, I understand that certain image formats don't compress nearly as well, but those are usually formats that are ALREADY compressed).

Possibly the real problem is the current compression formats that are used, and not whether compression is a worthwhile task. Maybe Matrox would like to use a non-lossy format of compression, especially if they really plan to implement displacement mapping (hopefully along with all the other goodies like full vertex/pixel shading capabilities) and still have at least good overall performance and image quality..

Maggi
5th December 2001, 09:02
hehehe ... TIFF is a bad example, since you could also use JPEG compression within and that one is definitely not lossless ... ;)

Usually scanners provide uncompressed TIFF format or maybe even LZW compression, while the latter indeed is lossless.

Now imagine you'd scan a completely empty sheet of paper (= white) and store it as an uncompressed TIFF. The resulting file size can be calculated by the following term:
(amount of horizontal pixels) * (amount of vertical pixels) * 3 bytes for RGB (in case you scanned in ordinary true colour = 24bpp) = file size of scan
Now using LZW compression will result in a dramatically shrunk file size, since the document contains only one RGB value, used for every pixel, and that can be put into a formula like "use 255/255/255 RGB for 10000 pixels width and 10000 pixels height" or something to that effect.

An actual example: a movie's end-title scroller that we just rendered at 3656x1976x48bpp (16 bits per channel). The uncompressed images were well above 28 MB each, while enabling lossless compression in the SGI file format shrunk them down to 700 KB - 4 MB, depending on the actual picture's contents.

:D

It all boils down to the compression routine, i.e. either it is lossless or not, plain and simple.

The compression ratio is a totally different topic, as my example above may illustrate, but as a rule of thumb one could say that lossless compression results in a larger amount of data than lossy compression.

:)

Anybody got a clue what kind of compression is used in DXTC ?
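Plugging numbers into that blank-page example (the 10000x10000 dimensions are just the illustrative figure from the post): uncompressed, a 24-bit scan of that size is enormous, while a run-length/LZW-style encoding of a single repeated colour collapses to almost nothing.

#include <stdio.h>

int main(void)
{
    long long w = 10000, h = 10000;    /* the post's example dimensions */
    long long bytes = w * h * 3;       /* 3 bytes per pixel at 24 bpp   */

    printf("uncompressed 24-bit scan: %lld bytes (~%.0f MB)\n",
           bytes, bytes / (1024.0 * 1024.0));   /* 300,000,000 bytes, ~286 MB */

    /* A blank page is one RGB value repeated w*h times, so any
     * run-length/LZW-style coder reduces it to a few kilobytes at most. */
    return 0;
}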

IcedEarth
5th December 2001, 09:50
Originally posted by Ace
Well, let's examine the lossless/lossy argument for a second. If we really HAVE to lose quality to get decent compression, why is it that I can take a TIFF file (lossless image format) from my scanner, ZIP it using WinZip (compression tool), and the file size is greatly reduced (this is a 32bpp scanned image)? Keep in mind that the decompression of the contents of the ZIP file results in an exact copy of the original picture. (Of course, I understand that certain image formats don't compress nearly as well, but those are usually formats that are ALREADY compressed).


That's because a TIFF file is sometimes LARGER than a BMP file. Believe me when I say that 2:1 is about the best you can get; I've worked on a thesis about lossless image compression for two years :)

>>edit<<
For the compression ratios I'm talking about photo-quality pictures here. If you take a white sheet of paper the compression ratio will obviously be better than 2:1 :)

superfly
5th December 2001, 16:39
If I'm not mistaken, the type of compression used in DXTC is the lossy type regardless of which ratio and method is used, since there are a few different methods within the DXTC spec...


I'll have to check to be sure though.....

[GDI]Raptor
6th December 2001, 02:15
I don't think texture compression is the most important feature. It is important, but there are other features that are more important. Efficient FSAA is more important, and also some sort of hidden surface removal. AGP 8x is coming out soon, and if Matrox is going to release/announce their card with DirectX 9 (March/April) (CeBIT maybe....), they should support AGP 8x in their card!

mute
6th December 2001, 04:29
Game dev conf
San Jose, California
March 19-23, 2002


tentative.. that's what matrox says on their site.. product launch probably.. after that it's E3

CHHAS
6th December 2001, 05:20
I'll believe it when I see it.

Maggi
6th December 2001, 05:29
Originally posted by CHHAS
I'll believe it when I see it.

are you sure ?

:D

Snake-Eyes
6th December 2001, 08:16
I knew my compression example stunk, heh.

I tend to look to texture compression as just another tool in the battle to make most efficient use of the onboard memory of the graphics card. Besides the need for the new DX features to be supported, I still believe that an efficient method of utilizing 64MB (128 soon?) of onboard memory without causing severe bottlenecks (all while using the features- without them it's almost a moot point) is going to make or break video cards in the near future.

Using system RAM as an adjunct for video storage (in the sense of textures and such) is an unworkable scheme for now, at least in the context of high-performance real-time 3D rendering (gaming-oriented, of course). I refuse to count on any new AGP spec having a major impact on performance, after seeing the minimal impact moving between 1x, 2x, and 4x has had to date. It has to be kept in mind that system memory isn't only being used and accessed by the graphics card, so delays are incurred as other devices and the processor also make accesses (except in the case of memory controllers capable of making multiple simultaneous accesses, similar in concept to the TwinBank controller on nVidia's nForce and the memory controller nVidia uses in GF3s).

All the discussion going on here (and especially Haig's comments) has me hopeful that Matrox might actually have something revolutionary (when compared to other consumer-level products, anyhow) waiting for us. The only thing I can say to that is- "It's about time!" (speaking as an early adopter and still present owner of a G400Max, and also as one that uses a current GF3 in my gaming rig, lol.)

Helevitia
6th December 2001, 12:13
http://www.eetimes.com/story/OEG20011205S0054

[GDI]Raptor
6th December 2001, 13:30
March would be a good time to announce the card, and ship it together with DirectX 9.

A lot of events in spring 2002: Game Developers Conference, CeBIT, E3....

Let's cross all our fingers for Matrox! My AGP slot is still waiting for something new.... :p

Wombat
6th December 2001, 13:47
I just read the last few pages of this thread (sorry, there's not much use in me reading here ;) ).

Two things:
JPEG can be lossless. A quality level of 10/10 will result only in lossless compression. Also, TIFFs can use all sorts of compression inside, or none at all. You can take a .RAW file, open it in a text editor, and type a TIFF header onto it, and it will be opened just fine as a TIFF (handy when a program says it doesn't support .RAW). I don't believe Iced. I've worked on image compression too, and if you know something about what you're compressing, you can really up your compression ratio. We were doing 20:1 lossless, and 50:1 with no visually perceivable loss.

Also, some people just said that speeding up RDRAM would cut its latency. This isn't so. RDRAM is a serial protocol, and some parts of its design just add latency that can't be avoided. They did get low-voltage differential signalling in there, which was a smart move (see also HyperTransport), but other than that I don't like RDRAM in PCs that much. There's a saying around here: "You can always get more bandwidth, if you're willing to pay for it, but not even God can give you better latency." Yeah, pin count will cost you, and that's part of the reason for RDRAM in consoles. Also, RDRAM is a lot more feasible in those configurations than it is in the RIMMs that PCs require. Last time I checked, Rambus couldn't test the RIMMs until the heat dissipator was applied, and if a RIMM is bad, chips can't be removed - so the whole RIMM is trash. Puts a cramp on yield. (Even if you can't see it, your SDRAM sticks have often had a chip replaced at manufacturing time.)

IcedEarth
6th December 2001, 15:12
Originally posted by Wombat


Two things:
JPEG can be lossless. A quality level of 10/10 will result only in lossless compression. Also, TIFFs can use all sorts of compression inside, or none at all. You can take a .RAW file, open it in a text editor, and type a TIFF header onto it, and it will be opened just fine as a TIFF (handy when a program says it doesn't support .RAW). I don't believe Iced. I've worked on image compression too, and if you know something about what you're compressing, you can really up your compression ratio. We were doing 20:1 lossless, and 50:1 with no visually perceivable loss.


Lossless JPG is called LJPG and is an entirely different method of compressing images. It doesn't have very much in common with the lossy JPG format. But you're right, there *is* lossless JPG.

I'm sure that with certain specific kinds of images you can get a 20:1 ratio in lossless mode, but I dare you to compress a 32-bit image with lots of detail at a ratio better than 2:1.
Of course there are also lots of 'nearly lossless' compression techniques, but I was merely talking about pure lossless.

Check out http://www.bitjazz.com/statistics.html e.g. for some results of lossless compression of detailed images. Apparently there are new techniques that achieve slightly better ratios, but an average of 20:1 is out of the question.

Wombat
6th December 2001, 16:14
The original JPEG standard has lossless modes. If you crank the quality to the maximum, then there isn't any DCT work done, and it's just Huffman encoding.

We were compressing 15- to 22-bit grayscale images.

superfly
6th December 2001, 16:50
While I agree that the current AGP protocol and current memory technology are still too slow to transfer and access large amounts of data quickly, I wouldn't say that the situation won't ever change...

I mean, like you mentioned, AGP 8x is on the horizon, as well as faster versions of both DDR SDRAM and RDRAM running at higher frequencies and on wider buses.


Fast-forward a couple more years and you'll see the first implementations of 3GIO tech, which lays the groundwork for even faster system bus speeds as well as even faster versions of AGP (16x to begin with, which has roughly a 4 GB/sec transfer rate) and can also be applied to the next-generation PCI spec beyond PCI-X (1 GB/sec transfer rate), which is due to show up in a few months, mostly on server boards for the time being.

One of the really cool features of the new AGP 8x spec is that it allows graphics cards with multiple processors to work together without any driver tricks or specialised hardware needed on the cards themselves to fool the current AGP spec into thinking there's only one graphics processor, which is what such a card needs to work properly in an AGP 4x slot.

I do believe that sooner or later card makers won't have any choice but to put multiple processors on their cards, for one because you can't keep shrinking chips indefinitely, and smaller process tech is getting harder to develop each year: it takes about 18 to 24 months for a new fab process to become available, and if you're on a six- or even 12-month release schedule, well, you're pretty much screwed, since at most you're forced to release slightly higher-clocked versions of a previous generation in order to say that you've got something new.


Hmmmm...... that sounds vaguely familiar, doesn't it?....;)
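The AGP figures being quoted are just the 66 MHz, 32-bit base rate multiplied up; a quick sketch of that, where the 16x row is an extrapolation to match the roughly 4 GB/s figure mentioned above rather than any published spec:

#include <stdio.h>

int main(void)
{
    const double base_mb_s = (200.0 / 3.0) * 4;   /* 66.67 MHz x 32-bit = ~267 MB/s */

    int multipliers[] = { 1, 2, 4, 8, 16 };       /* 16x is an extrapolation here   */
    for (int i = 0; i < 5; i++) {
        double mb_s = base_mb_s * multipliers[i];
        printf("AGP %2dx: %5.0f MB/s (%.1f GB/s)\n",
               multipliers[i], mb_s, mb_s / 1000.0);
    }
    return 0;
}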

IcedEarth
7th December 2001, 00:16
Originally posted by Wombat
The original JPEG standard has lossless modes. If you crank the quality to the maximum, then there isn't any DCT work done, and it's just Huffman encoding.

We were compressing 15- to 22-bit grayscale images.

Like I said, an entirely different method :D

Maggi
7th December 2001, 01:50
Originally posted by Wombat
...
JPEG can be lossless. A quality level of 10/10 will result only in lossless compression.
...

Hi Wombat,

I just cannot believe that ... :)

Example:
- take an uncompressed picture and save it as JPEG in 10/10 (or 12/12 or whatever is highest quality setting)
- close the document
- open up both documents (JPEG and uncompressed)
- merge them with 'difference' as layer option
- raise color gain (or adjust levels) so that very dark pixels get lighter

in my experience there was always a difference, which gets more and more evident the harder you colour-correct the resulting 'difference merge'; but if there were no loss when using JPEG, the difference would always be 0/0/0 RGB, that is, a totally black picture.

What program do you use for 'lossless' JPEG compression?

SmellYfeeT
7th December 2001, 09:49
The original JPEG standard has lossless modes. If you crank the quality to the maximum, then there isn't any DCT work done, and it's just Huffman encoding.

Wombat,
I have to totally disagree with your statement!!! The whole point of JPEG is the use of the Discrete Cosine Transform to compact the energy of the picture. The Huffman coding is always done on the DCT coefficients. In order to achieve LOSSLESS JPEG, you should not perform QUANTIZATION ON THE DCT COEFFICIENTS. That is the whole idea of JPEG: the bigger your quantization, the bigger the loss. If you do not quantize, you will not degrade the quality, but at the same time you can still achieve pretty good compression, since you are not Huffman-coding the picture itself, but rather its frequency components.

Huffman-coding the picture directly is like zipping a normal data file. If anyone is interested in more than this, I know a few good links with examples showing how JPEG works.

As far as AGP 8x goes, I am not sure it would improve performance that much, and there is no point right now anyway. Maybe next year it would really matter, but I guess it's a good point for marketing anyway :)
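To make the quantisation point concrete, here is the baseline JPEG flow in rough C terms. This is only a sketch of the stages being argued about, not real libjpeg code: the loss happens at the quantisation step (divide and round), while the Huffman stage is lossless, so setting every quantiser to 1 is what a "maximum quality" setting approximates, leaving only the rounding from the transform itself.

#include <stdio.h>
#include <math.h>

/* Baseline JPEG, greatly simplified:
 *   8x8 block -> DCT -> quantise (divide + round) -> Huffman (lossless)
 * The only lossy step modelled below is quantise(). */

static int quantise(double dct_coeff, int q)
{
    return (int)lround(dct_coeff / q);      /* larger q => coarser => more loss */
}

static double dequantise(int level, int q)
{
    return (double)level * q;
}

int main(void)
{
    double coeff = 137.6;                   /* one made-up DCT coefficient */

    for (int q = 1; q <= 16; q *= 4) {
        double restored = dequantise(quantise(coeff, q), q);
        printf("q=%2d: stored %4d, restored %6.1f, error %5.2f\n",
               q, quantise(coeff, q), restored, fabs(restored - coeff));
    }
    return 0;
}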

superfly
7th December 2001, 16:09
I completely agree that for the time being,there isn't much of a need for AGP 8X since developers knowingly limit themselves to what current protocols and memory tech can handle and for the most part,that won't change anytime soon,save for maybe games like DOOM 3,which will stress any system,no matter how high end it may be.


Id is one of the few companies out there with the financial freedom to do whatever they want and their strong point is making awsome game engines that allow other developers to build great games,so i can't wait to see other companies get their hands on that engine.....:).


It just a shame that even current video cards like the GF3 and radion 8500(and hopefully matrox) can handle so much more than the AGP 4x can dish out that we may never see what those cards can really do.

Wombat
7th December 2001, 18:26
Just to cover a few points:


Huffman coding the picture is like zipping a normal data file.

Exactly. And most jpeg compression programs do this by default. Maximum quality settings (according to the JPEG definition) do NO quantization of the coefficients. That's what I was talking about, but didn't know I could/should get into a detailed discussion of the DCT implementation here. Also, I think the "12/12" quality setting may be part of the newer JPEG standards. When I was doing this work in mid '99, they were still choosing the details of the next JPEG standard, as well as JPEG2000, which was something different. My bosses were involved in some of the selection. So, maybe my information is incomplete these days.

To address Maggi's questions: I was often using cjpeg and djpeg, the programs you get when you compile the libjpeg6b code. My first guess is that the differences you see, Maggi, are just palette downsampling. The libjpeg code can be compiled for 8-bit color or 12-bit color; those are the only options. Maybe other engines have more or other features, but I would guess that you're seeing the results of palette changes <I>before</I> the encoding was done. I've done the same kind of tests that you're talking about, and if the input image fits the requirements, the output file can be turned back into an identical image.

I do know what I'm talking about here. I read most of the JPEG reference textbook (whose name escapes me, that book stayed with that lab). I doubt if I can still do it, but I used to be able to open a JPEG file in a hex editor, and read the headers, picking out the tags and values for resolution, compression, and palette info. Lame, but true.

IcedEarth
8th December 2001, 02:35
Originally posted by Maggi


Hi Wombat,

I just cannot believe that ... :)

Example:
- take an uncompressed picture and save it as JPEG in 10/10 (or 12/12 or whatever is highest quality setting)
- close the document
- open up both documents (JPEG and uncompressed)
- merge them with 'difference' as layer option
- raise color gain (or adjust levels) so that very dark pixels get lighter

In my experience there was always a difference, and it becomes more and more evident the harder you color correct the resulting 'difference merge'. If there were no loss when using JPEG, the difference would always be 0/0/0 RGB, i.e. a totally black picture.

What program do you use for 'lossless' JPEG compression?

He's right though. It's just that most programs don't implement the lossless version of JPEG. That 10/10 value is an indication of the visual quality rather than a guarantee of binary reversibility.

rubank
8th December 2001, 10:34
Maggi,

I have to agree with Icy and Womby.

If you do the operation you described using Photoshop 6 and set JPEG quality to "maximum", the result will be a 0/0/0 RGB image.
You can verify it with "adjust/equalize", whereupon Photoshop answers that there is only one brightness value in the image.

If, on the other hand, you use Corel Photopaint 10, the best JPEG setting will not result in lossless compression. You just have to magnify the image to see the brown pixels.

This is of course only interesting from a theoretical standpoint; for all practical purposes a mildly compressed JPEG image is indistinguishable from the uncompressed image on a monitor.

rubank

mute
8th December 2001, 13:08
oh and what is a matrox cg2000 ?

saw something in the new bios, there's something about a marvel g550 eTV
and a g550 engineering board
and g550 prototype
and matrox dual condor

orangejulius
8th December 2001, 14:58
You're right.

PBIOSWIN.EXE contains:
00067488: Matrox Marvel G550 eTV
000674B8: G550 Engineering Board
000674D0: G550 prototype
00067694: Matrox CG2000 (http://www.matrox.com/videoweb/news/press/pr/pr_cg2000.htm)
00067760: Matrox Dual Condor
00067910: Digital First Millennium G450 32S

PROGBIOS.EXE contains:
0004CB70: :filetype TOUCAN
0004CDDA: :filetype CONDOR

[GDI]Raptor
8th December 2001, 16:20
Dual Condor... What's that..?? If you search through the latest driver from Matrox... what do you find??

KngtRider
8th December 2001, 20:23
cg2000 could mean condor g2000

if there is a dual, would YOU BUY IT THOUGH?

all the multi-chip solutions from all the manufacturers didn't work well for SOHO use

re the g550 prototype - it could have been one with the full T&L engine available to all apps, for matrox to develop the T&L on

g550 etv - they said no more low end capture cards

mute
9th December 2001, 04:19
ohh yeah, now i remember the cg2000
if you still haven't seen what it is, orangejulius posted the url in a previous post

wasn't the g450 codenamed condor? dual could mean dualhead
i'd have to check previous bios releases

Maggi
9th December 2001, 15:53
Hi Guys,

seriously, I didn't know there was a lossless JPEG version, and even the highest setting in Photoshop 6 is, contrary to what Rubank stated, not lossless.
I guess it heavily depends on the picture's contents, because my example resulted in a difference picture that contains 116 unique colors.

Uncompressed it is 1.440.056 bytes in size; in Photoshop JPEG 12/12 baseline optimized it is 336.560 bytes.
I stored the difference as GIF, because it has only 116 colors and hence GIF is the smallest lossless format for it. The exaggeration of that difference picture was done by raising gamma to 9, and it is also a GIF for the above reasons.

I don't have any webspace, hence I'll attach the pictures one after another.

Here's the original, compressed quite heavily to match the max file size of this forum

Maggi
9th December 2001, 16:08
unfortunately I had to convert the difference pictures to JPG, because the GIFs were 318KB each ... doh!

I also had to lower the gamma, because the resulting JPG was about 1.2MB in 12/12 and still over 100KB in 1/1 ... :D

So here now the black looking difference picture without gamma correction

Maggi
9th December 2001, 16:14
Finally, the difference with gamma set to 5, but be aware that using 5/5 as the quality level lowered the picture detail.

omegaRED
9th December 2001, 16:33
Toucan is the codename for the G400, and Condor for the G450, dual Condor is obviously G450 DualHead ...

KngtRider
9th December 2001, 17:46
he's right, i remember running some g400 info utility and it said toucan rev whatever

Wombat
9th December 2001, 21:22
How do you know the GIF was lossless? Not criticizing, just want to know for my own understanding.

IcedEarth
10th December 2001, 05:19
Originally posted by omegaRED
Toucan is the codename for the G400, and Condor for the G450, dual Condor is obviously G450 DualHead ...

All G450s are dualhead by default, so it's kinda weird to make a distinction between a Condor and a dual Condor then, right?

rubank
10th December 2001, 05:39
Deleted to save bandwidth

Maggi
10th December 2001, 06:41
Originally posted by Wombat
How do you know the GIF was lossless? Not critisizing, just want to know for my own understanding.

Hi Wombat,

AFAIK GIF is able to store 256 unique colors (8 bit out of 24), and all my tests indicate that they are stored losslessly. Since the difference picture above contained 116 colors (image > mode > indexed color = "exact"), GIF is able to store it as it is, without having to dither and/or round down.

:)
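(If you want to double-check the "GIF keeps up to 256 colors losslessly" part yourself, a small Python sketch along these lines should do it, assuming Pillow and NumPy; "difference.png" is just a placeholder name, and whether the round trip comes back identical depends on how the program builds the 256-color palette.)

import numpy as np
from PIL import Image

img = Image.open("difference.png").convert("RGB")
pixels = np.asarray(img)

# Count the unique colors; GIF can only hold up to 256 of them.
unique_colors = len(np.unique(pixels.reshape(-1, 3), axis=0))
print("unique colors:", unique_colors)

# Round trip through GIF and compare pixel for pixel.
img.save("roundtrip.gif")
back = np.asarray(Image.open("roundtrip.gif").convert("RGB"))
print("identical after GIF round trip:", bool(np.array_equal(pixels, back)))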


Hi Rubank,

I guess it depends on the actual picture contents, i.e. in my case I scanned the pic myself in very high resolution, so that even the film grain was reproduced properly ... :D
I have no clue about your example, but it seems there's not too much detail anyway, and hence it might very well be the case that JPEG can store it losslessly. Especially if you started off with a JPEG and saved it to TIFF or BMP, you wouldn't see any differences.
Just do a scan of your own and compare again.

Another aspect could be that, to my knowledge, the JPEG file format uses the YUV colorspace, while BMP uses RGB and TIFF is more or less free to use any kind of color channel. Most likely that colorspace conversion alone already changes the picture, depending on the colors actually used.
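(A quick way to see that colorspace effect in isolation, assuming Pillow and NumPy; "scan.png" is just a placeholder name. Converting 8-bit RGB to 8-bit YCbCr and back usually does not reproduce the original values exactly because of the rounding in between.)

import numpy as np
from PIL import Image

rgb = Image.open("scan.png").convert("RGB")
ycc_roundtrip = rgb.convert("YCbCr").convert("RGB")   # through JPEG's usual colorspace

a = np.asarray(rgb, dtype=np.int16)
b = np.asarray(ycc_roundtrip, dtype=np.int16)
print("max channel difference:", int(np.abs(a - b).max()))   # usually small but not zero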

Cheers,
Maggi

Indiana
10th December 2001, 06:57
GIF is lossless because it uses the LZW algorithm for packing, which is a lossless one.
And while normal JPEG compression is lossy due to its algorithm regardless of the selected quality (Maggi is right here), there is in fact also a lossless JPEG variant, but it is usually a different codec, normally called LJPG. It could be that some Photoshop versions switch from JPEG to LJPG for the highest quality setting, leading to the different results people are getting here.

DukeP
10th December 2001, 07:03
Lol..

All very informative, but could we get back to speculating about the upcoming Matrox card??

Btw, TIFF also has a variant that uses lossy compression.. ;)

IcedEarth
10th December 2001, 07:54
Originally posted by Indiana
GIF is lossless because it uses the LZW algorithm for packing, which is a lossless one.

Uhm.... that's not true. As long as your color palette has no more than 256 colors, GIF is lossless. But if you save a 24-bit picture in GIF format, it will only keep 256 colors, which results in a lossy version of the original 24-bit picture. That's what Wombat meant, I guess :)

rubank
10th December 2001, 08:05
Duke,
I can understand your frustration :eek:
but I have to answer the magpie :D

Maggi,
of course I wouldn't use a JPEG file saved as a TIFF file, I'm not that kind of guy.
And as for detail, I'd say you can't tell without seeing the pic at its original size. Furthermore, brightness values are just as important (as a measure of file contents), and if there is not ONE SINGLE PIXEL of difference, I'd say the compression is de facto lossless regardless of what the format dictates.

Is the emperor naked? Yes. Does the pope have a beard? Yes. Can JPEG be lossless? Yes.
:D

rubank

Indiana
10th December 2001, 08:16
Originally posted by IcedEarth


Uhm.... that's not true. As long as your color palette has no more than 256 colors, GIF is lossless. But if you save a 24-bit picture in GIF-format, it will only save 256 colors which results in a lossy version of the original 24-bit picture. That's what wombat meant I guess :)

The compression algorithm used in GIF is still LOSSLESS by design, in contrast to JPEG, which uses lossy compression (independent of the quality setting). Of course GIF has the limitation of supporting max. 8 bit / 256 colors, but that has nothing to do with the compression being lossy or lossless. You simply cannot store a >256 color pic in GIF, so the application has to downsample it to 256 colors before saving. The saved GIF is still lossless compared to the 256 color image it got from Photoshop/...
Another lossless format is PNG, which has the advantage of higher compression ratios and support for bit depths >8.

The lossless LJPG is different in design from JPEG (i.e. not just a JPEG with a high quality setting); it is merely from the same family - like JPEG2000, for example.

So in short:
1. GIF and PNG are lossless, JPG is lossy.
2. There is a lossless LJPG, yes, but it uses a completely different algorithm than the usual JPEG.

P.S.: Rubank, for general use you're right with your "de facto lossless" statement, but not if you're doing many editing steps on a picture. The small losses add up with every step, so you should use a REAL lossless format (where each pixel is EXACTLY the same as it was) for any editing. You can still convert the final result to a more convenient format afterwards.

IcedEarth
10th December 2001, 09:45
I know, Indiana, but what I said was to support Wombat when he asked whether or not the GIF was lossless. By design GIF IS lossless, but if the picture you saved as GIF only has 256 colors while the original has a couple of thousand, I'd say the result is lossy. It's just a matter of point of view ;)

rev
10th December 2001, 10:33
guys, you are way toooo offtopic :)

we should really get back to the subject, does anybody know anything new? I'm really excited about the new Matrox card, especially if it has the specs we have mentioned. Some publication or info from Matrox would be heaven :)

keep on rollin' guys...

rubank
10th December 2001, 10:37
I APOLOGIZE TO YOU ALL AND WILL HUMBLY ACCEPT MY PUNISHMENT AND EAT A SHOVEL OF DIRT.

Maggi and Indiana are totally right, I just made an unforgivable mistake. I take back everything I said and state the opposite.

(I didn't close and reopen the JPEG file before doing the difference calculations. In Photoshop that's a problem, in Photopaint it isn't.) :o :o :o :o :o


rubank

Tempest
10th December 2001, 11:02
Originally posted by IcedEarth
By design GIF IS lossless, but if the picture you saved as GIF only has 256 colors while the original has a couple of thousand, I'd say the result is lossy. It's just a matter of point of view ;)

It's not the file format that dithers/decreases those colors; it is the app that knows the limits of the GIF file format and does the conversion for you. So it's not poor GIF's fault if you try to save a picture with millions of colors and it doesn't show up as expected. It's like trying to save a BMP file with more than 2^24 colors: the result could be said to be lossy, but the fact is that the file format was never designed to handle such information.

Oh yes, this is really getting offtopic now... But I'm sure that as soon as someone has some real news they will appear here :)
Edit: Oh yes, if someone wants some graphics related news then ATI has finally begun shipping the Radeon 8500DV.

IcedEarth
10th December 2001, 11:59
Originally posted by Tempest
It's not the file format that dithers/decreases those colors; it is the app that knows the limits of the GIF file format and does the conversion for you. So it's not poor GIF's fault if you try to save a picture with millions of colors and it doesn't show up as expected. It's like trying to save a BMP file with more than 2^24 colors: the result could be said to be lossy, but the fact is that the file format was never designed to handle such information.

I know, I know... it's the program, not the pic format, but the result is still lossy, isn't it? :D

BTW: I got myself a second hand MAX last month, while waiting for THE NEW MATROX VIDEO CARD! :D (getting back on-topic)

Maggi
10th December 2001, 13:48
no prob ... :D

Indiana
10th December 2001, 14:31
Originally posted by rev
guys, you are way toooo offtopic :)

we should really get back to the subject, does anybody know anything new?

Probably no, and this is the reason why this thread got that OT... :p

DukeP
11th December 2001, 14:36
On the other hand: it was far more fun when people were arguing over something (although rather irrelevant) than this silence.

What about a guessing contest: What would be the price for this new magic Matrox card?
I might as well start..

/me dusts off his magic but old crystal ball...

I foresee that the next magic high-powered superdeluxe dual petheradoctyl card from Matrox will cost: $298


;)

~~DukeP~~

Ant
11th December 2001, 14:37
£300 or £600 if you want the full 256MB of DDR memory :)

DukeP
11th December 2001, 14:43
ROFL!

:p

Well... I might even buy the full Monty...
:confused:

~~DukeP~~

MK
11th December 2001, 16:02
Originally posted by Ant
£300 or £600 if you want the full 256MB of DDR memory :)

£600 ?? Hey! I don`t want to buy the whole company, I just want to buy a new graphics card !!

MK ;)

GT98
11th December 2001, 16:39
Originally posted by MK


£600 ?? Hey! I don`t want to buy the whole company, I just want to buy a new graphics card !!

MK ;)

I heard the same rumors.... $300 for the low end! Makes me wonder if they are using eDRAM, since these prices are extreme!

Scott

Indiana
11th December 2001, 16:50
Still I think that begins to sound better.
In their prime, Matrox used to be the most expensive (but worth it!), until they decided to release el-cheapo OEM crap (G450/G550). If the new card finally features all the best money can buy again and is worth $500, I'll most probably get one.
(Even though my Radeon8500 should arrive tomorrow....)

Wombat
11th December 2001, 17:05
in contrast to JPEG, which uses lossy compression (independent of the quality setting).

This just isn't true. Lossy JPEG is lossy because of the quantization of the frequency coefficients. No quantization means no loss. And that is entirely possible with the JPEG file format, if the compressor doesn't quantize. I know, I've done it, dammit.

superfly
11th December 2001, 20:15
Just a quick question really....weren't we supposed to know what the bitboys were up to by now?....


I remember Nappe saying something to that effect a few weeks ago.....


I mean there's that ongoing bet at nvnews and 3dgpu that i'd love to see happen...lol.

snn47
11th December 2001, 22:33
Question: maybe I am too ignorant when it comes to memory, but why not add a fast memory controller with a small buffer that latches the data for several memory banks? That would let the GPU transfer the data to the latch in one clock, and then the buffer controller could address the memory banks, as a way of getting below CL 2-2-2.

IcedEarth
12th December 2001, 00:21
Originally posted by Wombat


This just isn't true. Lossy JPEG is lossy because of the quantization of the frequency coefficients. No quantization means no loss. And that is entriely possible with the JPEG file format, if the compressor doesn't quantize. I know, I've done it, dammit.

Uhm, now I have to disagree with you, Wombat. I've done this myself as well, and without quantization JPG is still lossy. Maybe it depends on the implementation, but since the calculations have to be done with real numbers, the inverse calculation always has a small error margin, resulting in a slightly different reconstructed image.
Could have been bad programming on my part though :D
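(For what it's worth, here's a toy NumPy/SciPy sketch of the rounding issue described above; it isn't how any particular encoder is implemented, but it shows that rounding the coefficients to integers for entropy coding, plus rounding the reconstructed pixels back to 8 bits, is usually enough to change a few values even without a real quantization table.)

import numpy as np
from scipy.fftpack import dct, idct

def dct2(b):
    return dct(dct(b, axis=0, norm='ortho'), axis=1, norm='ortho')

def idct2(c):
    return idct(idct(c, axis=0, norm='ortho'), axis=1, norm='ortho')

rng = np.random.default_rng(1)
block = rng.integers(0, 256, size=(8, 8)).astype(float)   # made-up 8x8 pixel block

# Even the mildest quantizer (step 1, i.e. coefficients simply rounded to integers
# so they can be entropy coded) plus the final rounding back to 8-bit pixels can
# change a few values.
coeffs = np.round(dct2(block))
restored = np.clip(np.round(idct2(coeffs)), 0, 255)

print("differing pixels:", int((restored != block).sum()))   # typically a handful, not zero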

rubank
12th December 2001, 06:33
I should have read this earlier, I know. Yeah yeah I know.

http://www.faqs.org/faqs/jpeg-faq/part1/section-13.html

The JPEG organisation points to this FAQ, so I guess it's valid.



On another matter, I know this isn't today's news, but I haven't seen any comment here on the subject of...

http://www.kentrontech.com/New_News/Press_2001/PressR20011206.htm

Do any of you clever guys have a comment? Would this tech "solve the bandwidth issue"?

rubank

GT98
12th December 2001, 08:37
Originally posted by superfly
Just a quick question really....weren't we supposed to know what the bitboys were up to by now?....


I remember Nappe saying something to that effect a few weeks ago.....


I mean there's that ongoing bet at nvnews and 3dgpu that i'd love to see happen...lol.

The latest rumor I heard was that Bitboys was gonna send out production samples for reviews to prove that they are doing something (yeah right!). If we don't hear anything within the next month or so... I would totally write them off and not give them the time of day when it comes to announcements!

superfly
12th December 2001, 15:03
Oh well... I guess the guys at those sites have nothing to fear then....


But boy, did they ever have some really weird bets going on.... One of them promised to eat a can of dog food (with pictures to prove it) if Bitboys actually delivered this time out....:D :D :D.