Rockbox Technical Forums

Support and General Use => Audio Playback, Database and Playlists => Topic started by: senab on October 01, 2006, 08:13:22 AM

Title: Codec Efficiency Comparison Test (iPod)
Post by: senab on October 01, 2006, 08:13:22 AM
Intro

Being a bit bored today, I thought I'd do a bit of a comparison test of how efficient Rockbox's codecs are when given files with varying bitrates. I used most of the codecs supported by Rockbox, but left some out (WAV for obvious reasons, AC-3 because it's fairly useless at low bitrates). Bear in mind these results are only really useful for iPods and other PortalPlayer-based builds.

Method

To do this test I compiled a new 5G iPod Video build from CVS, but edited the debug_menu.c file to not display the buffer progress bars in the 'View Audio Thread' screen. This was because the progress bars would use extra CPU cycles to render, and I wanted the CPU to be doing as much audio decoding as possible.

I then picked a track off a CD and encoded it with various encoders and settings (which can be seen in the table below). For a more detailed look at the files, see the analysis here (http://senab.iddx.net/rockbox/codec_analysis.html).

I then loaded my iPod with my new Rockbox build and the files in a folder. When booting Rockbox I cleared my settings, thus making sure no DSP or equalizer was on. I also changed the WPS to one without a peakmeter (boxes_320x240).

I then played back each file one by one, leaving it on the WPS screen until around 50% of the track was played. I decided to wait to 50% because by then the audio buffer would be filled and the boost ratio would have levelled out. I then went into the Debug menu --> View Audio Thread, and waited for the boost ratio to become level before recording the value.

Results

See here (http://senab.iddx.net/rockbox/results.html)

Conclusion

After the recent optimizations to libFaad, AAC is decoding very nicely in realtime now. MP3 & Vorbis decode at very similar speeds across the bitrates, though Vorbis has quality gains at lower bitrates. Musepack was the most efficient lossy codec on test. I was quite surprised by just how well FLAC decoded, even using q8, which is the highest compression level in libFLAC.


I'm not sure whether this test was particularly worthwhile, but it may give you an indication of how efficient the codecs you're using are. Of course, this test was done with a build not using any DSP or fancy graphics, so if you're using those don't expect anything as low.
Title: Re: Codec Efficiency Comparison (iPod)
Post by: bk on October 01, 2006, 09:15:50 AM
I'm assuming by "iTunes" and "Nero" you mean LC-AAC, right? I don't use Nero but I read somewhere that at low bitrates it will encode in HE-AAC.

Interesting results. I would have expected libmad (MP3) to be the fastest.
Title: Re: Codec Efficiency Comparison (iPod)
Post by: senab on October 01, 2006, 10:45:44 AM
Yes, LC-AAC. I put both Nero and iTunes in because Nero uses "true" VBR, whereas iTunes' implementation of VBR is really ABR; I just wanted to see if it makes any difference.  ;)
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: Davide-NYC on October 01, 2006, 03:41:46 PM
Here's a stupid WPS that may be helpful: It has no alternating sub-lines, no peakmeter, no progressbar, no statusbar, etc.

It does give you the essential information you need for testing: playlist length and current position, filetype, bitrate, VBR/CBR, etc.

Let me know if this is at all useful.  ;D

[attachment deleted by admin, too old]
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: Rincewind on October 01, 2006, 06:42:15 PM
Let me know if this is at all useful.  ;D

No it's not  ;D
(but it might be useful for other people)

When I "test" performance I usually leave the settings just the way I normally use Rockbox; "no side effects" is a utopia anyway  :-\
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: soap on October 01, 2006, 10:19:47 PM

When I "test" performance I usually leave the settings just the way I am normally using rockbox, "No side-effects" is an utopia anyway

While I understand that sentiment, give me a chance to explain why at this stage in the game it is counter-productive.

Runtime and efficiency data are most useful when they are collected in a consistent manner with the number of variables reduced and documented to the best of the tester's ability.

Look at the current status of the runtime wiki pages, for example.  http://www.rockbox.org/twiki/bin/view/Main/IpodRuntime and http://www.rockbox.org/twiki/bin/view/Main/IriverRuntime.
These pages, in their current state (no offense to the generous people who contributed), are next to worthless when it comes to tracking Rockbox's progress in achieving longer runtimes.  Without documentation of the settings used there is no way to tell if the runtime is affected by CPU-hungry options or WPSs with lots of cycle-stealing eye candy.  Without performing the same test with the same files on the original firmware there is no way to judge the condition of the tester's battery.

By the same token, the Rockbox forums and Wiki are not exactly overflowing with runtime and efficiency data, despite the fact that this would be a very easy way for non-developers to contribute.  If just 1% of the users unhappy with iPod battery life could be encouraged to...I must be smoking something.

Since there is so little data collected, I feel it is of utmost importance that the tester take that extra step to do the collection in a manner which is most useful to the study of performance as a whole, and not necessarily the most applicable to their personal usage patterns.  At the least I beg of them to collect both.

Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: Davide-NYC on October 02, 2006, 12:30:38 AM
@ Soap: I could not, no matter how hard I tried, have stated it better.  :)

We need a new wiki page (CodecPerformance maybe?) which hosts the necessary sample files and a good description of how people should run the codec performance tests, and how and where to report their findings. Using a single 3-minute WAV file encoded in every supported codec at every possible bitrate should do the trick.

The best solution would be to create a WPS that reports the boost ratio directly on screen! I have no idea if this is possible or not but a true TEST.wps (rwps) would rule.

Then we need to expand BatteryRuntime to include a standard sample album and standard settings and WPSes etc and an explicit procedure on how to test and how to report back findings for battery runtime.

Then we need to solicit participation from the users. I bet IriverRuntime and IpodRuntime become full of data very quickly.

People are itching to help. Most just don't know how to get 'jiggy' with the C code. Myself included.  ;D

Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: mnhnhyouh on October 02, 2006, 01:46:29 AM
*sticks hand up to help with testing*

h
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: Rincewind on October 02, 2006, 10:56:34 AM
I didn't want to criticize your testing methods. To compare settings it is essential to minimize the influence of other factors.

My post wasn't very serious. Maybe I should have put more emoticons in ;)

The best solution would be to create a WPS that reports the boost ratio directly on screen! I have no idea if this is possible or not but a true TEST.wps (rwps) would rule
Yes, I was thinking of that, too. I don't know if it's easy to do, because these values are probably in separate threads. If I can do it in a few hours I might try it next weekend.

Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: Llorean on October 02, 2006, 11:07:46 AM
Why would a WPS need to show the boost ratio at all? Why can't you just leave it on the audio thread screen?
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: Davide-NYC on October 02, 2006, 12:17:28 PM
You're correct. Better to slightly expand the Audio Thread screen.

If the audio thread screen told you the current filetype and bitrate as well as time remaining in the current track we'd be set.

 ;D
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: Llorean on October 02, 2006, 01:29:31 PM
Why is all that information important to a codec efficiency test? I would assume that if you were testing it, you'd prepare a playlist containing only the desired filetype and bitrate in advance. As for time remaining in current track, I'm not so sure I see that as useful at all for this.

I'm just saying, I'm not sure any coding actually needs to be done at all, though I could easily be missing where the information would be valuable on that screen.
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: Davide-NYC on October 02, 2006, 02:14:43 PM
Maybe I am misunderstanding something or there is a bug in the "View Audio Thread" screen which is causing confusion...
--> What exactly does the "track count" tell us? It is not reporting what I expect it to.

As far as reporting some file info in the View Audio Thread screen:
It would reduce human error and be much easier if the filename, the filetype and the bitrate were displayed on the View Audio Thread screen. If multiple samplerates are implemented then we should display that too (if at all possible).

Since Senab has disabled the "bufferbars" anyway, which requires patching debug_menu.c, I'm figuring it's no different to the tester if we add a line on screen that reports file info.

In fact, I vote that if this change gets implemented it should be committed to CVS. It's just useful, IMO.
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: Davide-NYC on October 02, 2006, 06:36:15 PM
Line 350 of apps/debug_menu.c insert -->

struct mp3entry* id3 = audio_current_track();

snprintf(buf, sizeof(buf), "filename:");
lcd_puts(0, line++, buf);

snprintf(buf, sizeof(buf), "%s", id3->path);
lcd_puts(0, line++, buf);

snprintf(buf, sizeof(buf), "codectype: %d", id3->codectype);
lcd_puts(0, line++, buf);

snprintf(buf, sizeof(buf), "bitrate: %d", id3->bitrate);
lcd_puts(0, line++, buf);

Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: Llorean on October 02, 2006, 06:37:35 PM
I'm still not sure why that's at all relevant to a codec efficiency test. All it does is inflate the binary a wee bit more with information already available in plenty of places; in fact, information that the tester *should* have decided on before starting the test.
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: Davide-NYC on October 02, 2006, 06:49:13 PM
If I'm testing 21 files I want to make sure I'm testing the file I'm supposed to be testing.
I know myself and I know I'll get distracted or miss a file or whatever. This way confusion is impossible.  I also want to minimize the WPS as much as possible in terms of CPU load. If we're to solicit the regular users (like me) in helping us test I feel it necessary to give something clear and comprehensible to look at.

Don't worry, I don't have CVS write access and I don't want it.
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: Rincewind on October 03, 2006, 05:09:30 PM
maybe a nice solution would be a real (unofficial, nothing in cvs of course) testing version of rockbox with dedicated testing features like flushing the audio thread, clearing buffers with button presses, triggered timers...

Though this would increase CPU load a little bit. The question is: do we want to find out which settings give the best CPU efficiency, or do we want to find the "best" codec?
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: senab on October 05, 2006, 03:49:35 AM
How do you define the best decoder then?

Some people want more battery life
Some people want to fit more music on their player
Some people want both

What's best is personal to the user  ;)
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: Davide-NYC on October 05, 2006, 10:52:42 AM
My intention was just to see how different codecs performed across targets at a given time.

(been busy lately, will get to the wiki page ASAP)
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: soap on October 07, 2006, 07:08:08 PM
The wiki entry is looking nice, but after looking at Senab's encode.bat I feel it is important to say that these are not CBR encodings - despite the implications made by both the filenames and the wiki page table.

 
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: Davide-NYC on October 07, 2006, 07:17:14 PM
OK here is the (very preliminary) wiki page. http://www.rockbox.org/twiki/bin/view/Main/CodecPerformanceComparison

I fixed senab's batchfile (it had a tiny bug in it) and I modified the patch to display filename, codec, and bitrate in the 'view audio thread' screen.

Let me know what you all think. Running tests now.
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: saratoga on October 07, 2006, 07:58:16 PM
OK here is the (very preliminary) wiki page. http://www.rockbox.org/twiki/bin/view/Main/CodecPerformanceComparison

I fixed senab's batchfile (it had a tiny bug in it) and I modified the patch to display filename, codec, and bitrate in the 'view audio thread' screen.

Let me know what you all think. Running tests now.


That looks fantastic.
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: soap on October 07, 2006, 08:39:16 PM
Regarding again the target bitrates:
What is the bitrate variation amongst the encoded files?  Are the mp3 encoded files within reasonable tolerances of their ogg counterparts?
The reason I ask is, if the "128kb/s" MP3 is actually 120 and the "128kb/s" OGG is actually 132 (or vice versa), that would be more than enough to call the results into question.
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: Davide-NYC on October 08, 2006, 12:38:33 PM
The reason I ask is, if the "128kb/s" MP3 is actually 120 and the "128kb/s" OGG is actually 132 (or vice versa), that would be more than enough to call the results into question.

Hmm. Agreed, I'll modify the batch file to encode in CBR in all possible instances.

This morning I was trying to test with an iriver H3x0 and realized that, since there is no HD activity LED on the unit, a quick way to determine whether there was any disk activity during the boost ratio calculation would be to enable the status bar in the "view audio thread" screen. Good idea?

Of course I have no idea how to do this codewise.  ;D

Any care to chime in? I would modify the VAT screen patch accordingly.
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: bk on October 08, 2006, 01:05:53 PM
Hmm. Agreed, I'll modify the batch file to encode in CBR in all possible instances.

Forcing CBR for natively VBR formats (Ogg Vorbis) would be misleading. I think the value of these benchmarks is not necessarily in the comparison but rather in the absolute boost level per target. In other words it's less useful to compare codec A to codec B, but more useful to know that codec A is 20% slower on ARM relative to ColdFire.
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: soap on October 08, 2006, 02:21:26 PM
Hmm. Agreed, I'll modify the batch file to encode in CBR in all possible instances.

Forcing CBR for natively VBR formats (Ogg Vorbis) would be misleading. I think the value of these benchmarks is not necessarily in the comparison but rather in the absolute boost level per target. In other words it's less useful to compare codec A to codec B, but more useful to know that codec A is 20% slower on ARM relative to ColdFire.

Forcing at least ABR is a must for this test to mean anything.  Comparing MP3 to OGG when the bitrates differ substantially is worthless.
(http://img175.imageshack.us/img175/1558/image1hp4.jpg)
As for the original encodings, only 192 and 256 are useful, IMHO.
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: senab on October 08, 2006, 04:39:07 PM
Yes, this test wasn't really a test of bitrates against each other, and you couldn't really do such a test. Bitrates will change with the individual sample (i.e. Vorbis q6 is supposedly ~192, but gives me ~180 very often).

What you can do with the results is plot a bitrate vs. boost ratio scatter graph and then create a line of best fit to compare how the codecs perform across bitrates.
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: soap on October 08, 2006, 04:55:46 PM
Yes, this test wasn't really a test of bitrates against each other, and you couldn't really do such a test. Bitrates will change with the individual sample (i.e. Vorbis q6 is supposedly ~192, but gives me ~180 very often).

What you can do with the results is plot a bitrate vs. boost ratio scatter graph and then create a line of best fit to compare how the codecs perform across bitrates.

Why couldn't you force ABR or CBR on the codecs that support it?  Even if the different codecs allocate bits inconsistently throughout the song, the test is the average boost percentage for the song is it not?  So as long as the average bitrates are comparable the test should provide a valid comparison.
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: senab on October 08, 2006, 04:59:11 PM
Well, that's it: would using CBR put less of a strain on the CPU? It would be an unfair test then. Here's the type of graph I was thinking of:

(http://senab.iddx.net/rockbox/boost.png)
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: soap on October 08, 2006, 05:10:19 PM
Maybe CBR would present less of a strain, but I believe most/all support ABR.

Regardless, your graph has won me over.
EDIT:
(as long as the X-axis is the actually achieved ABR, not the attempted BR)
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: Llorean on October 08, 2006, 05:12:30 PM
Hmm...

There is ONE small problem about all this: It's hard/worthless to compare between say, Coldfire and ARM still.

Because of differences in processor architecture, unboosted and boosted speed, and operating system overhead, boost ratio really wouldn't mean that much when comparing the two.
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: soap on October 08, 2006, 05:17:07 PM
I think the plan was all along to collect two different data-sets and never the two shall meet.
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: senab on October 08, 2006, 05:19:06 PM
It's the achieved ABR ;)

Yes, ABR would probably be the better option, but most codecs are optimized for VBR output. Vorbis has no true CBR output, and neither does Musepack or Nero; even the iTunes CBR bitrate can fluctuate.
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: Llorean on October 08, 2006, 05:21:38 PM
Another problem (native to the Coldfire on H100 at least), then, is that MP3 in MANY cases can decode entirely unboosted.

It may be necessary to clock down the processor further before starting testing, so that you don't have worthless data points (in my mind 0% boost is without value since you don't know if it's exactly 0%, or less, or more).

The real BEST way to test codec efficiency is this I think:

Create a transcoder to WAV. Then, time how long the transcode takes (while boosted the whole time) for files of various bitrates/qualities.

Do the transcode without yielding so that 100% of CPU time goes only to it. Don't even bother saving the data, just discard.
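That benchmark boils down to: decode the whole file flat-out, throw the PCM away, and divide the audio duration by the wall time. A host-side C sketch of the idea; decode_frame_stub is a placeholder, not a real Rockbox or codec API call:

```c
#include <time.h>

/* Placeholder for a codec's frame decoder. A real benchmark would
 * call the actual codec library and discard the decoded PCM. */
static void decode_frame_stub(void) { /* decode one frame, discard it */ }

/* Decode `frames` frames of `frame_ms` ms each without yielding,
 * then return the realtime multiple (audio seconds per CPU second).
 * A value of 3.0 would mean "decodes at 300% of realtime". */
static double bench_decode(int frames, double frame_ms)
{
    clock_t t0 = clock();
    for (int i = 0; i < frames; i++)
        decode_frame_stub();
    double elapsed = (double)(clock() - t0) / CLOCKS_PER_SEC;
    double audio_s = frames * frame_ms / 1000.0;
    return elapsed > 0 ? audio_s / elapsed : 0.0;
}
```

On target the same loop would run with the CPU boosted the whole time, so the only thing being measured is the codec itself.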
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: soap on October 08, 2006, 05:29:26 PM
That methodology seems sound.
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: bk on October 08, 2006, 05:40:07 PM
There is ONE small problem about all this: It's hard/worthless to compare between say, Coldfire and ARM still.

Because of differences in processor architecture, unboosted and boosted speed, and operating system overhead, boost ratio really wouldn't mean that much when comparing the two.

Not true. If a codec is running at 90% boost on one arch and 60% boost on another, then clearly the optimization effort would be best directed towards the first architecture. "Operating system overhead" is meaningless since Rockbox is running on both.
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: Llorean on October 08, 2006, 06:34:52 PM
You clearly don't understand the situation.

On one architecture, it's boosting from 30 to 75. On another it's boosting from, I think, 45 to 124. So the 45 one may boost 0% of the time simply because the codec runs full speed at 45, while the 30 one boosts some of the time because the codec doesn't run full speed at 30.

It is an unequal comparison of how optimized the codecs are, simply because the processors are in separate conditions.

Operating System Overhead IS DIFFERENT between different hardware. Because Rockbox is being compiled into ARM assembly vs M68K assembly operations take different amounts of time to complete. This means that basic code like User Input handling can be less or more efficient on one architecture than another. Then when you get into the code, because one has different inputs than the other, this introduces further differences in operating overhead. Take into account that each screen requires a different amount of time to update, and it will always be updating as long as you yield to the UI thread, and you get even *more* differences in operating overhead because the OS on one hardware has to spend more time drawing than on the other.

There are VAST differences in how Rockbox itself performs on different hardwares, completely independent of codec performance.

For examples of this, try scrolling in long lists on an H300 vs an H100, or on an iPod Nano vs an iPod Photo vs an iPod Video.


Using a transcoder at full boost that does not yield you accomplish a few things:
1) You prevent any other code from executing, insuring that you're ONLY timing the codec itself.
2) You are running the codecs using the full power of the processor without ANY questionable overhead, I believe, which then means that you have MP3@128 on ARM7 @ 75MHz vs MP3@128 on M68K @ 124MHz. Then you at least only have three real variables: the codec itself, the maximum processor speed, and the differences architecture makes (a 75MHz ARM7TDMI is not the same actual speed as a 75MHz M68K Coldfire).

As far as I'm aware, there's a ridiculous amount of things that can cause a variance in the results.

If it's running at 90% boost on a processor that only goes up to 60MHz and runs at 25MHz idle, vs 60% boost on one that runs at 50MHz idle and 200MHz boosted, it's pretty clear that it's running less efficiently on the latter, and this case could reasonably come up. The only proper solution is to figure out how efficiently it runs relative to the architecture itself, which means you need to be able to establish a performance value that is independent of actual processor speed and any hardware you can possibly remove from the equation.
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: bk on October 08, 2006, 06:43:22 PM
You clearly don't understand the situation.

Do me the favor of not presuming what I do or do not know.

Quote
On one architecture, it's boosting from 30 to 75. On another it's boosting from, I think, 45 to 124. So the 45 one may boost 0% of the time simply because the codec runs full speed at 45, while the 30 one boosts some of the time because the codec doesn't run full speed at 30.

a) On different architectures cycles are not 100% equivalent; b) if the goal is to increase performance of various codecs then boost ratio is an adequate metric to use, regardless of hardware differences.

Quote
It is an unequal comparison of how optimized the codecs are, simply because the processors are in separate conditions.

Somewhat true but irrelevant. If codec A has 10% boost on ARM and 75% boost on ColdFire then optimization effort on ARM is not as worthwhile as effort spent on ColdFire for that codec.

Quote
Operating System Overhead IS DIFFERENT between different hardware. Because Rockbox is being compiled into ARM assembly vs M68K assembly operations take different amounts of time to complete. This means that basic code like User Input handling can be less or more efficient on one architecture than another. Then when you get into the code, because one has different inputs than the other, this introduces further differences in operating overhead. Take into account that each screen requires a different amount of time to update, and it will always be updating as long as you yield to the UI thread, and you get even *more* differences in operating overhead because the OS on one hardware has to spend more time drawing than on the other.

Also do me the favor of not explaining the process of compilation as if I was a child. Rockbox is the operating environment on all architectures: if there are threading inefficiencies (for example) on one target affecting codec performance then it would be exposed through these tests (indirectly) and could be addressed. This is the entire point of performance benchmarking.
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: Llorean on October 08, 2006, 06:49:18 PM
In response to your "Cycles are not 100% equivalent" you may have noticed in many places I mentioned that very point.

You seem to think that all that matters is its boost ratio. Which is completely pointless since the processors run at different speeds, the ARM core we currently have being half the speed of the coldfire.

Your theory is that even if a codec is running *faster* on one architecture than on another, if the processor is slower and thusly it must boost more, concentration should go there. Which is silly to an extent, because the codec is actually *more* efficient in that situation, and optimization efforts could very well be wasted time or diminishing returns. It's better to know an absolute value of efficiency. You can still concentrate more on the slower processors if the speed/efficiency ratio means that processor will be boosting more, but it also gives you a benchmark upon which to base  overall speed for that architecture. Particularly useful in considering future targets. For example, if you're considering a slower ARM based target, you can have a better idea what may or may not be feasible on it.


And the reason I suggested you don't understand what's going on, is because you said ""Operating system overhead" is meaningless since Rockbox is running on both." which is what I responded to with that. It is clearly an untrue statement, as this thread is relating to CODEC EFFICIENCY, which means the tests should be related to the Codecs themselves, not Rockbox performance on a given system. It does differ, and hardware (particularly screens) interfere GREATLY and cannot necessarily be improved. This is not a fact that can be simply washed away with the word "irrelevant."

Otherwise we can simply say "All codec optimization efforts should happen on the 5G iPod(or perhaps 3G iPod)" because simply put it has the single worst playback performance of any current system other than the 3G iPod with its broken cache.

Edit: As a side note, please don't talk back to me about explaining compilation. The VAST majority of users here are not very aware of such topics, and I have no way of knowing whether you are or aren't. It is not speaking down to you as if you are a child, as VERY few children understand that concept. It's speaking down to you as if you were "average", which is often not considered speaking down to someone at all, simply speaking to someone who may not be aware of the details of a technical concept. If you cannot discuss this idea without getting offended, I suggest you step away from it for a few hours and come back when you have cooled down, as in no way is ANYTHING I say here meant to belittle you. I'm just stating my perception on the topic. As far as I'm concerned, I said nothing untrue, up to and including the fact that your statements demonstrated a lack of knowledge about the causes of differing performance across hardware and the relevance of operating system overhead on the topic of "Codec Efficiency Comparison".
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: saratoga on October 08, 2006, 07:56:14 PM
There is ONE small problem about all this: It's hard/worthless to compare between say, Coldfire and ARM still.

Because of differences in processor architecture, unboosted and boosted speed, and operating system overhead, boost ratio really wouldn't mean that much when comparing the two.

Not true. If a codec is running at 90% boost on one arch and 60% boost on another, then clearly the optimization effort would be best directed towards the first architecture. "Operating system overhead" is meaningless since Rockbox is running on both.

I think what Llorean meant was that boost doesn't tell you which is faster, not that the numbers themselves were useless.

Quote
On one architecture, it's boosting from 30 to 75. On another it's boosting from, I think, 45 to 124. So the 45 one may boost 0% of the time simply because the codec runs full speed at 45, while the 30 one boosts some of the time because the codec doesn't run full speed at 30.

It is an unequal comparison of how optimized the codecs are, simply because the processors are in separate conditions.

I think bk just means comparing the relative boost ratios tells you which platform needs optimization most.  I'm not really sure why we would care about that, but it's a valid point.  I agree with you that it doesn't really mean anything, though.

Quote
Somewhat true but irrelevant. If codec A has 10% boost on ARM and 75% boost on ColdFire then optimization effort on ARM is not as worthwhile as effort spent on ColdFire for that codec.

I don't see how you can conclude this.  Given the differences in ISA, power consumption and battery capacity, it's entirely possible that they're equally worthwhile.  For instance, Coldfire could be highly optimized but poorly suited for the task, while ARM could be poorly optimized but well suited.

At any rate, since the developers working on each platform are different people, it's not very relevant.  Knowing that X needs optimization more than Y doesn't help if there's a fixed group of people who work on X and a separate group that only works on Y.
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: Llorean on October 08, 2006, 08:00:45 PM
As far as I'm concerned, testing boost only gives you relevant information under the normal playback condition; comparing boost between codecs on the same platform is fine, but between platforms it destroys the value of the data. If all the codecs require more boosting on one system than the other, is it that all the codecs are still poorly optimized, or is it that the playback code needs work?

Knowing that an MP3 decodes at 300% speed on ARM7 @ 75MHz, and 350% speed on M68K @ 124MHz, is much more relevant information about overall codec efficiency on those platforms.

If this were a Playback Efficiency thread then I would gladly admit that operating system concerns would be relevant to the topic (and they are relevant to users), but either way they should still be measured entirely separately from the codec (the playback system should probably be measured with WAV files).
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: Davide-NYC on October 09, 2006, 01:10:22 PM
In trying to refine the testing process:

CBR, ABR or VBR? I think whatever we choose should be the most "default" encoding option, i.e. the encoding options used the majority of the time. What is the batchfile going to tell the encoders to do, exactly? Do we have consensus on this? My vote is to use the encoder defaults, because presumably that's what most files are encoded with.

OK, so the Coldfire and ARM data sets should not be merged. Do we need to separate the different PortalPlayer chips? PP5002, PP5020, etc?

Coldfire Set:
iriver iHP-100    
iriver H120    
iriver H140    
iriver H320    
iriver H340    
iAudio X5

ARM Set:
iriver H10 5/6GB    
iriver H10 20GB
iPod 3G    
iPod 4G
iPod mini (1G & 2G)
iPod Color
iPod Nano
iPod Video

 ??? What about the statusbar in "VAT" screen for HD activity? Many targets do not have an HD activity LED. It's nice to be able to see if disk access is interfering with the CPU boost ratio. Please post some code for me so that I may expand the debug_menu.c patch further.

About the transcoder to WAV plugin:
This is a great idea. The plugin could output performance data to a logfile. Just like the battery_bench plugin. This would be MUCH EASIER for the user to do.

Once the Codec_Bench plugin is written (hint hint), this entire thread becomes moot, right? We would just tell users to download and encode the test files and run the plugin until it finishes. Right?
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: Llorean on October 09, 2006, 01:45:48 PM
A more useful feature would be a transcoding interface that just happens to have a "benchmark" option somewhere that causes it not to output a file. ;)


For your list of players, the primary things that will make a difference are: Screen and Processor. The available RAM should be entirely irrelevant to an actual codec test as long as you're making sure the whole file gets buffered, so the H110, H115 and H120 should perform identically to the H140, and the same is true between 320 and 340. The iPods are all different though because of either different screens or different PortalPlayer chipsets.
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: Davide-NYC on October 09, 2006, 01:55:30 PM
A more useful feature would be a transcoding interface that just happens to have a "benchmark" option somewhere that causes it not to output a file.

What I meant by output was just a logfile that listed decoding times and whatever else was relevant to the test...

Example:

Target and Build Date.

flac_5.flac   --> time.
flac_8.flac   --> time.
lame_096.mp3  --> time.
lame_128.mp3  --> time.
lame_192.mp3  --> time.
lame_256.mp3  --> time.
lame_320.mp3  --> time.
mpc_096.mpc   --> time.
mpc_128.mpc   --> time.
mpc_170.mpc   --> time.
mpc_224.mpc   --> time.
mpc_300.mpc   --> time.
mpc_350.mpc   --> time.
nero_096.m4a  --> time.
nero_128.m4a  --> time.
nero_192.m4a  --> time.
nero_256.m4a  --> time.
nero_320.m4a  --> time.
nero_400.m4a  --> time.
vorbis_096.ogg --> time.
vorbis_128.ogg --> time.
vorbis_192.ogg --> time.
vorbis_256.ogg --> time.
vorbis_350.ogg --> time.
vorbis_500.ogg --> time.
wv_fastx3.wv  --> time.
wv_normx4.wv  --> time.


That would rule and take all of the human error out. It would also be easy for the users to do.

Although, now that I think about it a transcoder that does "non-realtime-format-->WAV" would be a great tool for playing back files that are not playable on Rockbox (yet) while not in front of a computer. Hmmm!  :)
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: Llorean on October 09, 2006, 02:02:37 PM
No, I understand what you meant by output. Rather, what I meant was that instead of a codec benchmark plugin, it'd be more useful to spend the time working on a full transcoding interface, so you can take *any* supported file, and have it decoded to WAV and then encoded to the format of your choice.

Then, once that exists, a benchmark option could be added to it that fits most of the needs above, and as a bonus because it's a full transcode interface, could benchmark both encoding and decoding codecs. :)

By the "instead of outputting" bit, I just meant that when using the interface to benchmark, instead of saving a file it could log the results or print the timings to the screen.
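A benchmark mode along those lines only needs wall-clock timing around the decode pass; percent-of-realtime then falls out of the track duration. A minimal sketch of the idea (the `decode_fn` callable and the numbers are hypothetical placeholders, not Rockbox APIs):

```python
import time

def benchmark_decode(decode_fn, track_seconds):
    """Time a decode pass and report speed as a percentage of realtime.

    decode_fn stands in for 'decode the whole file, discard the output'.
    """
    start = time.monotonic()
    decode_fn()
    elapsed = time.monotonic() - start
    speed_percent = track_seconds / elapsed * 100.0
    return elapsed, speed_percent

# Example with a dummy "decoder" that just sleeps briefly,
# pretending it decoded a 240-second track:
elapsed, speed = benchmark_decode(lambda: time.sleep(0.1), track_seconds=240)
print(f"decoded 240 s of audio in {elapsed:.2f} s ({speed:.0f}% realtime)")
```

Each `name --> time` line in the log above would then just be a file name plus the `elapsed` (or `speed_percent`) value.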
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: Davide-NYC on October 09, 2006, 02:07:26 PM
Flyspray feature request? (plugin section)

I think this is a tremendous idea.

Especially since non-realtime encoded files could then be decoded and at least played without being in front of a computer.

Me likee.
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: Llorean on October 09, 2006, 02:12:13 PM
http://www.rockbox.org/tracker/task/6152 This read alright?
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: Davide-NYC on October 09, 2006, 02:28:18 PM
Perfect! If we standardize the test files we'll get some usable results back, right? Maybe this could be bundled with battery_bench and it could be a two-in-one test?

Something like this?

Attention all users that want to help:
Turn on the TSR plugin. (Transcode and Battery Bench)
Play the "test file folder" in a loop until the player shuts off.
Upload the logfile(s) somewhere.

Sounds easy enough for everyone! :)

What am I saying? I must be smoking too...  ;D
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: bk on October 09, 2006, 05:07:18 PM
I think bk just means that comparing the relative boost ratios tells you which platform needs optimization most. I'm not really sure why we would care about that, but it's a valid point. I agree with you that it doesn't really mean anything, though.

Assuming all codecs run realtime for all valid bitrates, the only reason to optimize further is to improve battery life; otherwise all codecs could run at 100% boost for all we care. In that context it would be useful to know which codecs drain the battery the most, and to see if it is possible to improve them.
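A rough way to see why boost ratio tracks battery drain: the CPU alternates between a base and a boosted clock, so its time-averaged clock (and, to first order, its dynamic power) scales with the boost ratio. A back-of-the-envelope sketch, assuming PP5020-style clocks of 30 MHz base and 75 MHz boosted (treat those figures as illustrative, not measured):

```python
def average_clock(base_mhz, boost_mhz, boost_ratio):
    """Time-averaged clock when the CPU spends boost_ratio of its time boosted."""
    return base_mhz + boost_ratio * (boost_mhz - base_mhz)

# A codec boosting 20% of the time vs one boosting 80%:
light = average_clock(30, 75, 0.20)   # 39 MHz average
heavy = average_clock(30, 75, 0.80)   # 66 MHz average
print(f"light codec: {light:.0f} MHz avg, heavy codec: {heavy:.0f} MHz avg")
```

So halving a codec's boost ratio cuts its average clock, which is where the battery-life gain from further optimization comes from.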

I personally don't see the point in finding out which codec is 'best' (the original intent of these tests); performance testing is only useful if it can be used to improve the codebase.

Quote
I don't see how you can conclude this. Given the differences in ISA, power consumption and battery capacity, it's entirely possible that they're equally worthwhile. For instance, Coldfire could be highly optimized but poorly suited for the task, while ARM could be poorly optimized but well suited.

I don't understand what you mean by 'well suited'. These are all general purpose embedded processors, if code is well optimized for ColdFire it will run fast and the tests will show that. Likewise for ARM, etc.

Quote
At any rate, since the developers working on each platform are different people, it's not very relevant. Knowing that X needs optimization more than Y doesn't help if there's a fixed group of people who work on X and a separate group that only works on Y.

Agreed, work can only be done when there are people able and willing to spend time working on the various targets.
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: soap on October 09, 2006, 06:28:42 PM
I personally don't see the point in seeing which codec is 'best' (the original intent of these tests), performance testing is only useful if it can be used to improve the codebase.

Regardless of who said what, the original point of the test was simply to collect data. You can't have too much objective data (let's not belabour that point, though we could). Tests measure, nothing more. "Which codec is best" is a judgment, not the subject of a test.
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: saratoga on October 09, 2006, 08:05:20 PM
Quote
I don't see how you can conclude this. Given the differences in ISA, power consumption and battery capacity, it's entirely possible that they're equally worthwhile. For instance, Coldfire could be highly optimized but poorly suited for the task, while ARM could be poorly optimized but well suited.

I don't understand what you mean by 'well suited'. These are all general purpose embedded processors, if code is well optimized for ColdFire it will run fast and the tests will show that. Likewise for ARM, etc.

There's no reason well-optimized code will necessarily run fast. The problem may simply be difficult, or poorly suited to the hardware. For instance, code that depends on Coldfire's MAC unit may never run well on PP CPUs that have to do separate multiply and add operations.
Title: Re: Codec Efficiency Comparison Test (iPod)
Post by: Davide-NYC on July 08, 2007, 02:32:30 PM
Now that the test_codec plugin exists we can revisit this topic.

The test plugin's "Speed Test Folder" option outputs a very simple log file. If we could append the results in a Twiki-friendly format to the end of the same file, we'd be well on our way to getting many users to submit results for all targets. (I'm not sure who submitted the plugin to begin with.)

See here (http://www.rockbox.org/twiki/bin/view/Main/CodecPerformanceComparison) if you're not sure what I'm talking about.