  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 240

High Resolution .gif files

Hello! I am trying to determine what the standard is for HD .gif files. What makes a .gif high resolution? For example, I know for my jpeg files this would begin at 300 dpi. Can someone tell me what metric I should be checking to ensure a .gif file is indeed high resolution? I do not deal with graphics and video normally, but .gif even less often, so please forgive me if this question is very basic and/or stupid :). I am using Windows 7 Pro 64bit on my computer.
mrosier
Asked:
2 Solutions
 
☠ MASQ ☠ Commented:
There is no specification for "HD GIF" as a format so you could simply base it on image size.
Don't confuse "HD GIF" with cinemagraphs which are a different animation technology but resemble GIFs with very high resolution.
 
Wayne Herbert Commented:
A clearing up of terminology is in order.  Resolution is the number of pixels (dots) per inch that a device can display.  

Consider your average computer monitor... it typically has a resolution of 96 pixels per inch; call it 100 for ease of calculation.  That gives it a resolution of 100 DPI, to use your terminology.

Consider your average inkjet or laser printer operating in standard mode.  It has a resolution of 300 pixels per inch, or 300 DPI.  Printers are also available, particularly in photo printers, that are capable of 600 DPI, 1200 DPI, and even more.

Now, let's turn to your image files.  Image files don't have a resolution, per se; they have a number of pixels in both the horizontal and vertical directions.  Let's assume that you have an image (either jpg or gif) that is 600 pixels horizontally and 600 vertically.

Let's display this upon your computer monitor.  Because your monitor is 100 DPI, you can see that the size of the picture on the display will be 600 / 100 = 6 inches.
 
Now, let's print out the same image on your 300 DPI printer.  Now you can see the actual size of the image on the paper will be 600 / 300 = 2 inches.

And if you printed out the same image on the 600 DPI printer, it would only come out one inch square (600 / 600 = 1 inch).
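The three calculations above can be sketched in a few lines of Python (illustrative only; the function name and the 100/300/600 DPI figures are just the examples from this comment, not part of any standard):

```python
# Physical size of an image on an output device:
# size in inches = pixels / device DPI.

def physical_size_inches(pixels, dpi):
    """Return the physical size, in inches, of `pixels` rendered at `dpi`."""
    return pixels / dpi

image_px = 600  # a 600 x 600 pixel image

print(physical_size_inches(image_px, 100))  # 100 DPI monitor       -> 6.0 inches
print(physical_size_inches(image_px, 300))  # 300 DPI printer       -> 2.0 inches
print(physical_size_inches(image_px, 600))  # 600 DPI photo printer -> 1.0 inch
```

The same image simply comes out smaller the higher the DPI of the device.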
 
Thus, when one speaks of "high resolution," one is really asking: how many pixels do I need in the image to get the size I want on the output device I am going to use?
 
So, let's say I want an image that completely fills the front side of a 3" x 5" postcard.  The printer tells me he needs a minimum of 300 DPI for printing.  You can quickly compute that the smallest image you can send him is 900 (300 x 3) by 1500 (300 x 5) pixels.
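The postcard calculation is just the same arithmetic run in reverse (again a sketch; the function name is mine, and the 3" x 5" / 300 DPI figures come from the example above):

```python
# Minimum pixels needed for a target output size:
# pixels = inches * required DPI.

def min_pixels(inches, dpi):
    """Return the minimum pixel count for `inches` of output at `dpi`."""
    return inches * dpi

# 3" x 5" postcard at the printer's minimum of 300 DPI:
print(min_pixels(3, 300))  # -> 900 pixels on the short side
print(min_pixels(5, 300))  # -> 1500 pixels on the long side
```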
 
On the other hand, if you want an image on a website that shows up at about 3" x 5" (monitor DPI is far less precise than printer DPI), you'll only need an image that is 300 x 500 pixels.

In the examples below, the first is an image that is 600 x 600 pixels, the second 300 x 300.  Which is "higher resolution"?  There's really no difference except the number of pixels on each axis.
 
Be aware that quite a few image programs give you the option to specify pixels per inch when creating a new image.  This is relatively meaningless, as the actual DPI depends on the output device.  It just means that, within the image program, that is the ratio the program will use when sizing an image described in pixels.

600OnASide.png
300OnASide.png
 
mrosier (Author) Commented:
MASQ gave me the first solution which was short and sweet, but Wayne gave a lot of great info to understand why the solution is what it is.