I'm wondering if anyone could give me some advice about optimizing the compression/resolution of web images.
Here's what's puzzling at this point (we're using only Photoshop at this time): we have some images that have outstanding resolution and detail, yet turn out to use a very small number of kilobytes. For example, some images are 14 KB, others are 46 KB, and so on...
Our JPEG images start out at 5 to 10 MB, and so far we've only been able to reduce them to about 150 to 200 KB before they start pixelating... Can someone tell me what they might know, or be using, that we don't know about yet?
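For what it's worth, the biggest lever is usually the pixel dimensions, not the quality slider: a 3000×2000 original re-encoded at high quality will always be large, while the same photo downscaled to the size it's actually displayed at compresses to a fraction of that. Here's a minimal sketch of that idea using Python's Pillow library (not Photoshop; the sizes and color are placeholders for illustration):

```python
from io import BytesIO
from PIL import Image  # Pillow; pip install Pillow

# Stand-in for a large photo (in practice: Image.open("photo.jpg")).
img = Image.new("RGB", (3000, 2000), (120, 150, 90))

# Downscale to roughly the pixel size shown on the page; file size
# tracks pixel dimensions far more than the DPI/"resolution" field.
img.thumbnail((800, 800))  # fits within 800x800, keeps aspect ratio

# Re-encode at a moderate JPEG quality; optimize=True trims a bit more.
buf = BytesIO()
img.save(buf, format="JPEG", quality=70, optimize=True)
print(img.size, len(buf.getvalue()), "bytes")
```

In Photoshop the equivalent steps are Image > Image Size (downscale first) followed by Save for Web at a quality around 60–70, which is likely how those 14 KB images with "outstanding detail" were made: detail relative to their small display size, not their original dimensions.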
P.S. If there is another forum that would be more appropriate for this question, let me know.
Thanks.




