It has come up several times here in the forum that JAlbum only uses the sRGB color space, so images in the AdobeRGB color space are therefore displayed with wrong colors. Until now the only option was to save the images in sRGB before loading them into JAlbum.
I just developed a filter for JAlbum which automatically converts AdobeRGB images to sRGB, so that JAlbum will generate good-looking images without you having to change the original files. It is a simple filter that is applied to all images.
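The conversion itself is standard color math, not anything JAlbum-specific. Here is a minimal sketch of the per-pixel calculation (the class and method names are just illustrative, not the plugin's actual API; the matrices are the published D65 AdobeRGB-to-XYZ and XYZ-to-sRGB values):

```java
// Illustrative sketch of an AdobeRGB -> sRGB pixel conversion.
// Class and method names are made up for this example.
public class AdobeRGBToSRGB {

    // AdobeRGB (1998, D65) -> CIE XYZ
    static final double[][] A2X = {
        {0.5767309, 0.1855540, 0.1881852},
        {0.2973769, 0.6273491, 0.0752741},
        {0.0270343, 0.0706872, 0.9911085}
    };
    // CIE XYZ -> linear sRGB (D65)
    static final double[][] X2S = {
        { 3.2404542, -1.5371385, -0.4985314},
        {-0.9692660,  1.8760108,  0.0415560},
        { 0.0556434, -0.2040259,  1.0572252}
    };

    /** Converts one 8-bit-per-channel AdobeRGB pixel to sRGB. */
    public static int[] convert(int r, int g, int b) {
        // 1. Decode the AdobeRGB gamma (approximately 2.2) to linear light
        double[] lin = {
            Math.pow(r / 255.0, 2.2),
            Math.pow(g / 255.0, 2.2),
            Math.pow(b / 255.0, 2.2)
        };
        // 2. AdobeRGB -> XYZ -> linear sRGB
        double[] xyz = mul(A2X, lin);
        double[] srgbLin = mul(X2S, xyz);
        // 3. Clamp out-of-gamut values and encode with the sRGB transfer curve
        int[] out = new int[3];
        for (int i = 0; i < 3; i++) {
            double c = Math.max(0.0, Math.min(1.0, srgbLin[i]));
            c = (c <= 0.0031308) ? 12.92 * c
                                 : 1.055 * Math.pow(c, 1.0 / 2.4) - 0.055;
            out[i] = (int) Math.round(c * 255.0);
        }
        return out;
    }

    static double[] mul(double[][] m, double[] v) {
        double[] r = new double[3];
        for (int i = 0; i < 3; i++)
            r[i] = m[i][0] * v[0] + m[i][1] * v[1] + m[i][2] * v[2];
        return r;
    }

    public static void main(String[] args) {
        int[] w = convert(255, 255, 255); // white stays white
        System.out.println(w[0] + "," + w[1] + "," + w[2]);
    }
}
```

Since both color spaces share the D65 white point, neutral colors pass through unchanged; saturated AdobeRGB colors outside the sRGB gamut get clamped.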
Here is an excerpt from the READ_ME file:

How to install
==============
Copy the content of this archive into the folder 'plugins' in your JAlbum folder.

How to apply
============
Open an album of images in AdobeRGB you want to convert. Open the Settings window. Go to the section 'Advanced'. Open the 'User Variables' tab. Set the name of the first variable to 'filter1' and its value to 'class=AdobeRGBtosRGBFilter prescale=true'.
Using this filter will NOT change the original images, only the images created by JAlbum. For further information on how to use filters in JAlbum go to: http://jalbum.net/filters.jsp

WARNING!
========
This filter will be applied to all images in your album. The filter does not check whether or not an image is actually in the AdobeRGB color space; all images are recalculated as if they were. This may cause color problems (usually oversaturation) in images which are not in the AdobeRGB color space. So only use this filter if you know what you are doing.
For more information read the READ_ME.txt file in the attached archive.
I have tested it on Windows and Mac, both times using JAlbum 7.3.1. Any comments on how it works for you are welcome.
I hope this helps all of you who had problems with this issue.
I just tried a test album with 32 photos in it, using Jalbum 7.4 and Chameleon 4.3.1. Without the filter in place, a "make all" took approximately 10 seconds to build. With the filter in place, the same album took five and a half minutes. Is this normal? I can't imagine what it would be like if I had several hundred photos in my album.
That's not surprising to me, as the filter goes through every single pixel and calculates its new color from the color of the original pixel. So the bigger the original images, the longer the calculation takes. It will certainly speed things up if you set 'prescale' to 'false'. I wanted to use the colors exactly as they were taken from the original image. It should also work after scaling, but I wanted to be on the safe side and suggest applying it before any other filters, even prescale filters. I cannot guarantee the outcome if any other filter is applied, but if this is the only one, it really should be no problem.
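To illustrate why the run time grows with the pixel count: a filter of this kind essentially boils down to a loop like the following (a sketch, not the plugin's actual code; `convertPixel` stands in for the real color-space math):

```java
import java.awt.image.BufferedImage;

public class PerPixelFilter {

    /** Applies a per-pixel color conversion to every pixel of the image. */
    public static BufferedImage apply(BufferedImage src) {
        BufferedImage dst = new BufferedImage(
                src.getWidth(), src.getHeight(), BufferedImage.TYPE_INT_RGB);
        // One conversion per pixel: a 3000x2000 photo means
        // 6 million calls, which is why large originals are slow
        // and why prescaling first reduces the work dramatically.
        for (int y = 0; y < src.getHeight(); y++) {
            for (int x = 0; x < src.getWidth(); x++) {
                dst.setRGB(x, y, convertPixel(src.getRGB(x, y)));
            }
        }
        return dst;
    }

    // Placeholder for the actual AdobeRGB -> sRGB calculation.
    static int convertPixel(int rgb) {
        return rgb;
    }

    public static void main(String[] args) {
        BufferedImage src = new BufferedImage(800, 600, BufferedImage.TYPE_INT_RGB);
        BufferedImage dst = apply(src);
        System.out.println(dst.getWidth() + "x" + dst.getHeight());
    }
}
```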
I hope this helps.
Since there was also a discussion about the performance of the filter in another thread of this forum ( http://jalbum.net/forum/thread.jspa?threadID=17235 ), I decided to look into that issue a little closer. It was actually worse than I thought. But the problem was not so much my code as the ColorSpace.toRGB(float[] colorvalue) function from Java which I used.
To avoid this problem I decided to build in a caching mechanism which stores the pairs of original and resulting color, so the filter doesn't need to recalculate colors it has already computed. The only problem was that at a certain size the cache would cause a Java OutOfMemoryError as it became too big. So I implemented several levels of caching which clear the cache at certain intervals.
Please note that the caching is enabled by default.
Again, here is an excerpt from the new READ_ME file:

Caching
=======
In version 2.0 a caching mechanism was introduced to reduce the calculation time. You can choose between 4 levels of caching: 'none', 'low', 'normal' and 'high'. To set the caching, just add e.g. 'caching=high' to the value of the filter variable. The default setting is 'normal'. The cache saves the already calculated colors in pairs of original color and resulting color. The problem is that beyond a certain size it will throw a Java OutOfMemoryError; the different levels determine how often the cache is cleared to avoid this problem. High caching means that there is one cache for the entire album. This is the fastest but also the most vulnerable way of caching, and it will break on large images and/or large albums. Normal caching means that the cache is cleared after every image; this will break only on large images. Low caching means that the cache is reset after every image and whenever it exceeds a certain size. The default size is 100 000 elements; this value can be changed by adding e.g. 'maxCacheSize=200000' to the value of the filter variable. If the caching variable is set to 'none', no cache will be created. This is the safest and also the most time-consuming way of generating your album.
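The idea behind the cache is simple: a map from original pixel color to converted color, cleared when it grows past a limit. A minimal sketch of how the 'low' setting could work (names are illustrative, not the plugin's actual classes):

```java
import java.util.HashMap;
import java.util.Map;

public class CachedConverter {

    private final Map<Integer, Integer> cache = new HashMap<>();
    private final int maxCacheSize;

    public CachedConverter(int maxCacheSize) {
        this.maxCacheSize = maxCacheSize;
    }

    /** Returns the converted color, computing it only on a cache miss. */
    public int convert(int color) {
        Integer cached = cache.get(color);
        if (cached != null) {
            return cached;       // hit: no per-pixel math needed
        }
        if (cache.size() >= maxCacheSize) {
            cache.clear();       // reset before the cache grows too big
        }
        int result = expensiveConversion(color);
        cache.put(color, result);
        return result;
    }

    /** Called between images to emulate the 'normal' caching level. */
    public void reset() {
        cache.clear();
    }

    // Placeholder for the real AdobeRGB -> sRGB calculation.
    private int expensiveConversion(int color) {
        return color;
    }

    public static void main(String[] args) {
        CachedConverter conv = new CachedConverter(100000);
        conv.convert(0xFF8040); // computed once
        conv.convert(0xFF8040); // served from the cache
    }
}
```

Clearing the whole map rather than evicting single entries keeps the code simple, at the cost of occasionally recomputing colors right after a reset.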
I also now recommend applying the filter with prescale=false as the default. This should really be no problem as long as it is the first filter to be applied.
I hope this helps with the performance issue. Thanks to all of you who gave me feedback on this.