My first question is: how do I know the number of DPI I need if I want to make a 13x19 print compared to a 4x6?
There is no fixed answer, and much depends on your personal standards and the subject matter of the image. At normal viewing distances, 300 ppi for a 4x6-inch print should produce very close to all the quality you can see, but if you're very picky or really look closely, then AFAIK the Pro-10 might benefit from as much as 600 ppi. The close-viewing limit applies equally to 13x19-inch prints, but as a practical matter, IMO a good file at 200 ppi can produce a very nice 13x19-inch print. (And 600 ppi at 13x19 inches requires roughly a 100 MP medium-format digital back--do you have one of those?) I have had a few 24x30-inch prints made from 16 MP files, i.e., 136 ppi, and at normal viewing distances most people think they look very good. And I have framed in my office an 11x14-inch print from a crop from a 6 MP camera, giving me 175 ppi, which only the fairly picky would find lacking in resolution or detail.
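The arithmetic behind those figures is simple enough to sketch. Here's a rough Python illustration; the 4928x3264 pixel count is an assumed, typical 3:2 16 MP frame, not any particular camera:

```python
# Sketch of the ppi arithmetic above. The 4928x3264 frame is an
# assumed, typical 3:2 16 MP file, not a specific camera model.

def effective_ppi(pixels, inches):
    """Ppi when a given pixel dimension spans a given print dimension."""
    return pixels / inches

def megapixels_needed(ppi, width_in, height_in):
    """Pixel count (in MP) required to print at a given ppi."""
    return (ppi * width_in) * (ppi * height_in) / 1e6

# A 16 MP file printed at 24x30 inches: the short side constrains,
# since a 3:2 image is wider than a 4:5 print.
print(round(effective_ppi(3264, 24)))         # 136

# 600 ppi across a full 13x19-inch sheet:
print(round(megapixels_needed(600, 13, 19)))  # 89 -- medium-format territory
```

That last number is why 600 ppi at 13x19 is out of reach for ordinary cameras.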
But to hint at two points of elaboration, in case you're interested:
(1) If part of what you're looking for is to know exactly what pixel dimensions you need to scale an image to in order to make a certain-size print at a certain nominal resolution, then I suspect you'll find it quite difficult to get an accurate answer (tons of people will give you approximate answers, which are easy to calculate). There are many variables. For example, I strongly suspect that "300 ppi" minilab printers are really 12 ppmm, i.e., 304.8 ppi. But then paper dimensions are inexact (or exact in metric and approximate in English units), and both inkjets and wet printing machines need to overspray a bit to make sure there are no white borders. In all my years of looking, I found
one lab that told you the exact dimensions its printers used; I remember, e.g., that its 8x10-inch prints used exactly 2456x3070 pixels. And of course the way inkjets simulate continuous tones with 4, 5, 6, 8, 9, 10, or 11 colors makes it a lot more complex--see the next point.
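To see how the metric-vs-English mismatch plays out, here's a quick sketch (keep in mind the 12 ppmm figure is my suspicion above, not a published spec):

```python
# Sketch: a "300 ppi" minilab that really addresses 12 pixels per
# millimeter (an assumption from the text, not a published spec).
MM_PER_INCH = 25.4

ppi = 12 * MM_PER_INCH
print(round(ppi, 1))                          # 304.8

# Nominal pixel dimensions for an 8x10-inch print at that resolution:
print(round(8 * ppi, 1), round(10 * ppi, 1))  # 2438.4 3048.0

# The one lab's stated 2456x3070 is slightly larger on both sides --
# consistent with a small overspray margin beyond the nominal paper size.
print(2456 / 8, 3070 / 10)                    # 307.0 307.0
```

So even a lab that publishes exact numbers lands a few ppi away from the round figure you'd naively calculate.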
(2) You used "DPI" and may be confusing it with "PPI". For inkjets that's a big difference. DPI is dots per inch: the Pro-10 can lay down a pattern of up to 4800x2400 dots per inch, but those dots aren't dots of any arbitrary visible color--they're dots of the eight colors of ink it can print. PPI is pixels per inch: what you edit are pixels, and pixels are much closer to representations of continuous tone.

At the simplest level, if you send a 300 ppi image to a 4800x2400 dpi inkjet (like the Pro-10), then for each pixel the printer can put whatever combination of dots of its 8 ink colors in a 16x8-dot matrix (the matrix being 1/300 x 1/300 inch) to simulate continuous tone. So if you have a dark green pixel in a 300 ppi picture that you print with a Pro-10, it can fill that 16x8-dot matrix with some combination of cyan, yellow, gray, and black spots of ink.

But really it's even more complicated than that. With sophisticated software, there's no reason a printer can't be controlled more along the lines of--this is a simplified example--'You fed me a 3264x4912-pixel image and asked for a 13x19-inch print. I'll have to crop a little from the long side, but in real terms I have 3264 pixels / 13 inches = 251 ppi to work with. If I print at 2400x2400 dpi, then I will have 2400 / 251 = 9.6x9.6 dots of ink with which to simulate the color of each pixel. I'll calculate exactly for that.' So theoretically--I doubt this is much of a practical issue!--the higher the PPI you insist an inkjet print at (e.g., 600 instead of 300), the less well it can simulate continuous tone; and conversely, the lower the PPI you let it print at, the better it can simulate continuous tone. And really, the algorithms can get a lot more complicated. I suspect almost all of us shouldn't worry about such things, at least unless we're writing printer-driver code or printing software.
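The dots-per-pixel arithmetic in that hypothetical can be sketched in a few lines (the 3264x4912 file and the driver's behavior are just the made-up example from above, not how any real driver is documented to work):

```python
# Sketch of the dpi-vs-ppi arithmetic. The file size and driver
# behavior are the hypothetical example from the text, not a real
# driver's documented algorithm.

def dots_per_pixel(dpi_x, dpi_y, ppi):
    """How many printer ink dots are available to render each image pixel."""
    return dpi_x / ppi, dpi_y / ppi

# A 300 ppi image on a 4800x2400 dpi printer: a 16x8 matrix of ink
# dots per pixel.
print(dots_per_pixel(4800, 2400, 300))  # (16.0, 8.0)

# The hypothetical 3264x4912 file printed at 13x19 inches: the driver
# effectively has about 251 ppi to work with.
ppi = 3264 / 13
print(round(ppi))                       # 251
# At 2400x2400 dpi that leaves roughly 9.6x9.6 dots per pixel:
dx, dy = dots_per_pixel(2400, 2400, ppi)
print(round(dx, 1), round(dy, 1))       # 9.6 9.6
```

Note how the dot budget per pixel shrinks as the requested ppi rises, which is the trade-off described above.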