
Sharpening for Metal Prints

Adamideas Registered Users Posts: 30 Big grins
edited May 16, 2012 in Finishing School
The first metal print I ordered really jumped off the page and blew everybody away. Everybody but me. Don't get me wrong, I did like it, but I felt it lacked detail and sharpness compared to a traditional print. Even though I didn't notice the lack of detail when viewing the image (12x18) from a few feet away, it still bothered me and I wanted to figure out why. There really aren't very many threads on dgrin that talk about it.

While this may already be known to some of us, I did some research and found that it's actually high heat and high pressure that effectively turn the ink into a gas and transfer it into a coating on the aluminum. It's not actually "in" the aluminum itself. So to me this clearly explains why we lose a little sharpness.

I recently got a metal print of an owl. I decided to oversharpen the fine feathers to counteract the loss of detail that occurs during the process. I felt it worked pretty well but could have used even more sharpening.

To avoid an expensive trial-and-error process, one thing I've considered but have yet to try is ordering a metal print featuring a grid of 1x1" tiled images, all with different sharpening settings, and judging how each tile fared (a rough script for this is sketched below). Yeah, same kind of idea as test strips in the darkroom.
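If anyone wants to script that idea, here's a rough sketch using Python and Pillow. The file name, crop origin, and sharpening amounts are all placeholders I made up, not a recipe:

```python
# Sketch: build one "test strip" file - a grid of identical crops, each
# sharpened with a different UnsharpMask amount - to print as a single
# small metal panel. All settings below are placeholders.
from PIL import Image, ImageDraw, ImageFilter

SRC = "owl.tif"                          # hypothetical source file
TILE = 300                               # 1" tile at 300 ppi
AMOUNTS = [50, 100, 150, 200, 250, 300]  # UnsharpMask percent values to test

img = Image.open(SRC).convert("RGB")
left, top = 1000, 800                    # placeholder crop origin (detail-rich area)
crop = img.crop((left, top, left + TILE, top + TILE))

cols = 3
rows = -(-len(AMOUNTS) // cols)          # ceiling division
sheet = Image.new("RGB", (cols * TILE, rows * TILE), "white")
draw = ImageDraw.Draw(sheet)

for i, amount in enumerate(AMOUNTS):
    tile = crop.filter(ImageFilter.UnsharpMask(radius=1.0, percent=amount, threshold=2))
    x, y = (i % cols) * TILE, (i // cols) * TILE
    sheet.paste(tile, (x, y))
    draw.text((x + 10, y + 10), f"{amount}%", fill="yellow")  # label each tile

sheet.save("sharpen_test_grid.tif")      # one file, one cheap test print
```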

Does anyone have any advice they can share about printing on metal? Or any interesting theories/ideas?

Adam

Comments

  • angevin1 Registered Users Posts: 3,403 Major grins
    edited May 11, 2012
    Adamideas wrote: »
    The first metal print I ordered really jumped off the page and blew everybody away. Everybody but me. Don't get me wrong, I did like it, but I felt it lacked detail and sharpness compared to a traditional print. Even though I didn't notice the lack of detail when viewing the image (12x18) from a few feet away, it still bothered me and I wanted to figure out why.

    Adam, is it at all possible to post a link to the image here in this thread, hopefully a full-sized one?

    The few metal prints I have had done all look fab, and they are 20x30 inches.
    tom wise
  • arodney Registered Users Posts: 2,005 Major grins
    edited May 12, 2012
    Seeing an example on-screen isn't going to help much, as what you see on screen for output sharpening to a printer isn't what you'll get in print (it isn't WYSIWYG). Using a small test for output is a good idea, but I'd be careful in how you select image types for this test. Images can be high or low frequency, and the two need to be treated differently.

    If you run an imaginary line horizontally across an image, you can determine whether it is high or low frequency. For example, a portrait would generally be low frequency, as there are very few light and dark contours along that line; it is mostly skin until you get to some higher-frequency areas (eyebrows, eyelashes). Take a shot of a forest scene: high frequency (lots of lights and darks along the line). The feathers you speak of sound like high frequency. So build a test with both types of images and season to taste.
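    If you want to put a rough number on that imaginary-line test, here's a quick sketch with NumPy and Pillow (purely illustrative; the file names are placeholders and any threshold you'd pick is to taste):

    ```python
    # Sketch: score an image row as "high frequency" (many light/dark
    # transitions) or "low frequency" by averaging the absolute
    # pixel-to-pixel luminance change along one horizontal scanline.
    import numpy as np
    from PIL import Image

    def scanline_frequency(path, row_fraction=0.5):
        gray = np.asarray(Image.open(path).convert("L"), dtype=float)
        row = gray[int(gray.shape[0] * row_fraction)]  # one horizontal line
        return np.abs(np.diff(row)).mean()             # mean luminance swing

    # A portrait (mostly smooth skin) should score low; a forest or a
    # feather close-up should score noticeably higher along the same line.
    for name in ("portrait.jpg", "forest.jpg"):        # placeholder files
        print(name, round(scanline_frequency(name), 2))
    ```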

    Also, output sharpening is not only subject dependent but also resolution and output-device dependent. You are handling the output-device dependency in your test, since you are actually sending the test to one device. That leaves the resolution of the data, which you can conform to each time; just be aware that sending, say, an 8x10 @ 300 ppi requires different handling than an 8x10 @ 200 ppi.
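    One way to handle that resolution dependency in the test itself (my own heuristic, not anything standard): scale the sharpening radius with output ppi, so the halo stays roughly the same physical size on the print.

    ```python
    # Sketch: keep the sharpening halo a constant physical size across
    # output resolutions by scaling the UnsharpMask radius with ppi.
    # Base values tuned at 300 ppi are arbitrary starting points.
    from PIL import Image, ImageFilter

    BASE_PPI, BASE_RADIUS = 300, 1.0

    def output_sharpen(img: Image.Image, out_ppi: int, percent: int = 150) -> Image.Image:
        radius = BASE_RADIUS * (out_ppi / BASE_PPI)   # e.g. ~0.67 px at 200 ppi
        return img.filter(ImageFilter.UnsharpMask(radius=radius, percent=percent, threshold=2))
    ```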

    This too might help:
    http://www.creativepro.com/story/feature/20357.html
    Andrew Rodney
    Author "Color Management for Photographers"
    http://www.digitaldog.net/
  • angevin1 Registered Users Posts: 3,403 Major grins
    edited May 12, 2012
    arodney wrote: »
    Seeing an example on-screen isn't going to help much, as what you see on screen for output sharpening to a printer isn't what you'll get in print (it isn't WYSIWYG).

    You're obviously way beyond me in expertise. However, if one starts with a poor file... hence why I asked to see it first.
    tom wise
  • Overfocused Registered Users Posts: 1,068 Major grins
    edited May 13, 2012
    angevin1 wrote: »
    You're obviously way beyond me in expertise. However, if one starts with a poor file... hence why I asked to see it first.

    You're right. Looking at the file will definitely show whether there are any problems in what's being sent for printing.

    And what he was talking about with WYSIWYG - Monitors are 72 DPI displays for the most part. The industry professional printing standard is 300 DPI for paper. Some printers even ask for 350 DPI. So technically, if it's sharp on his monitor, it should be even sharper in print. Maybe he's not sizing it to the exact dimensions of the print and is letting the printer resize it without doing it optimally. That can degrade print quality. Unless the printer does the upsampling for you in a way that preserves quality, sending an image at lower than a printer's recommended resolution will most likely result in a print that looks dull in terms of sharpness. The same goes for downsampling, since certain methods produce sharper downsampled images.
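    If you'd rather not trust the lab's resampling, here's a sketch of doing it yourself before uploading (the print size, file names, and sharpening numbers are just illustrative):

    ```python
    # Sketch: resize to the exact print dimensions before uploading, then
    # apply output sharpening, instead of letting the printer's RIP resample.
    from PIL import Image, ImageFilter

    img = Image.open("owl.tif")                      # placeholder file
    target = (18 * 300, 12 * 300)                    # 18x12" print at 300 ppi

    # LANCZOS keeps edges crisper than most defaults when downsampling.
    # (On Pillow older than 9.1, use Image.LANCZOS instead.)
    resized = img.resize(target, Image.Resampling.LANCZOS)

    # A touch of post-resize sharpening; tune it on a test print.
    out = resized.filter(ImageFilter.UnsharpMask(radius=1.2, percent=120, threshold=3))
    out.save("owl_18x12_300ppi.tif")
    ```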


    Otherwise, it may just not be a method that reproduces micro-details as well. I've got a couple of metal prints myself and think they look fantastic, but there aren't super-fine micro-details like feathers in them. Still, it's really damn sharp, vibrant, and beautiful.
  • arodney Registered Users Posts: 2,005 Major grins
    edited May 13, 2012
    Monitors are 72 DPI displays for the most part.
    Not really (not for a long time). It is super easy to figure out the exact output resolution of your display: measure the width of your display and divide the number of pixels it's displaying by that width.
    For example, on my NEC 3090, the width is 25.25 inches and its resolution is 2560x1600. 2560/25.25 = 101.4 ppi.
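    As a throwaway snippet, the same arithmetic in code (the numbers are from my display; measure your own):

    ```python
    # Sketch: actual display resolution = horizontal pixels / physical width.
    horizontal_pixels = 2560     # native horizontal resolution
    width_inches = 25.25         # measured width of the panel

    print(f"{horizontal_pixels / width_inches:.1f} ppi")  # ~101.4, not 72
    ```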
    The industry professional printing standard is 300 DPI for paper.
    Where would one find out about this printing standard? For what output devices?
    So technically, if it's sharp on his monitor, it should be even sharper in print.
    It should be the opposite: images that look too sharp and 'crunchy' on screen can print looking fine.
    Andrew Rodney
    Author "Color Management for Photographers"
    http://www.digitaldog.net/
  • Overfocused Registered Users Posts: 1,068 Major grins
    edited May 13, 2012
    Yeah, you're right about the 72 DPI thing. It's an old standard, and I still misuse the term. I'm only 25, but I can remember using computers since 1991, lol. 72 is still the de facto figure for web files and JPEGs, anyway.


    For 300 DPI, it's all over many websites. 300 DPI is the general standard for professional CMYK and inkjet alike, and 99.9% of them will specifically recommend 300 DPI or higher for photo prints. And it was pretty much the first thing I was told in school by my professor. This website puts the best info together, though:

    http://www.digicamguides.com/print/ppi-print-size.html

    A print at 300 dpi is 1/3 to 1/4 the size of the same image on a monitor display, so it's sharper but smaller. At the smaller final output size it condenses to look sharper than 100% on the monitor. So, if he's getting it sharp at 100% on the monitor, he technically should be getting some really sharp stuff in print.



    A big reason 300 DPI is a standard is that it's where human perception starts to not see much of a difference. For some processes, ink bleeds, and that's about as high as it goes in visible difference. This probably happens with metal prints, since it's a gas and not micro-droplets. I can tell the difference in areas with super-fine micro-detail above 300 DPI with inkjet printing, but that requires a photo really packed with detail, and requires your face to be inches from the print, lol.
  • arodney Registered Users Posts: 2,005 Major grins
    edited May 13, 2012
    For 300 DPI, it's all over many websites. 300 DPI is the general standard for professional CMYK and inkjet alike, and 99.9% of them will specifically recommend 300 DPI or higher for photo prints.

    Actually, no. So here's the deal. If you're working with halftone output, there is what is known as the quality factor, or Q factor**. You first find out what linescreen is used for printing. It could be low, around 85 lpi for newspaper, or much higher for a good-quality glossy sheet (185 lpi+). At about 150 lpi or lower, you use a quality factor of 2X. Most commercial-quality glossy magazine output uses a 150 lpi screen, so with a 2X QF you'd want a file at 300 ppi. You can use more, which buys you nothing but a larger file. And in the old days of trying to transfer data to a service bureau over a modem, the difference between 300 ppi and even 350 ppi was significant.

    Hence the old "use 300 dpi" (it is really 300 ppi) recommendation. It's super overkill for newsprint, where you'd use something more like 170 ppi.

    Now, as the linescreen gets finer, above 150 lpi, you can drop the quality factor to 1.5X. So if you were lucky enough to be using a super-high-quality print process and a 200 lpi screen, you'd multiply that by 1.5 (you would not send 400 ppi data). And this is all based on a halftone process. Inkjets and other contone printers are quite different. We don't send 2880 ppi data to an Epson that can output 2880 dpi, for a lot of reasons.
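    If it helps, here's the rule above as a quick calculator (a rule of thumb for halftone output only, not a spec):

    ```python
    # Sketch: required image resolution for halftone output via the Q factor:
    # ppi = lpi * Q, with Q = 2 at 150 lpi and below, Q = 1.5 for finer screens.
    def required_ppi(lpi: float) -> float:
        q = 2.0 if lpi <= 150 else 1.5
        return lpi * q

    for lpi in (85, 150, 200):   # newsprint, glossy magazine, very fine screen
        print(f"{lpi} lpi -> {required_ppi(lpi):.0f} ppi")
    # 85 lpi -> 170 ppi   (the newsprint figure above)
    # 150 lpi -> 300 ppi  (the familiar "use 300 dpi" rule)
    # 200 lpi -> 300 ppi
    ```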
    A big reason 300 DPI is a standard is that it's where human perception starts to not see much of a difference.

    But it isn’t a standard that that isn’t true for all kinds of halftone output. No one in their right mind would send 300ppi data to a press printing newsprint.

    If you go way back to the days of one of the first contone digital printers, the Kodak XL7700 (which I owned in the early '90s), its output resolution was exactly 203 dpi. Why? Because Kodak determined that this was the minimum data to send to get a continuous-tone output of sufficient quality. Keep in mind, in early 1990, the difference between an 8x10 @ 203 and an 8x10 @ 300 was more than enough to bog down Photoshop running on an 8 MB Mac IIci!

    **http://books.google.com/books?id=j7l0CxBUwGwC&pg=PA107&lpg=PA107&dq=LPI+Q+factor&source=bl&ots=xvG2wvlRaR&sig=wv4Rg_Ta3vv-68AgBK-b1V_3n20&hl=en&sa=X&ei=lxawT-enFcidiQKe14SABA&ved=0CFAQ6AEwAg#v=onepage&q&f=false
    Andrew Rodney
    Author "Color Management for Photographers"
    http://www.digitaldog.net/
  • arodney Registered Users Posts: 2,005 Major grins
    edited May 13, 2012
    This website puts the best info together, though:

    http://www.digicamguides.com/print/ppi-print-size.html

    Well, the website is wrong about display resolution, as we both agree:
    Your computer monitor displays images at 72 pixels per inch.

    Ah, maybe his does. Mine clearly doesn't, and yours probably doesn't either.
    Andrew Rodney
    Author "Color Management for Photographers"
    http://www.digitaldog.net/
  • Overfocused Registered Users Posts: 1,068 Major grins
    edited May 13, 2012
    arodney wrote: »
    Actually, no. So here's the deal. If you're working with halftone output, there is what is known as the quality factor, or Q factor... <snip>

    I'm talking about industry-standard printing for photographers, as per the OP's method: in general, what photographers use for high-definition artwork, and the printers that aim to service them. That's it. Not 99.9% of all printing in existence, and most definitely not newspaper and magazine print.
  • arodney Registered Users Posts: 2,005 Major grins
    edited May 14, 2012
    I'm talking about industry-standard printing for photographers, as per the OP's method: in general, what photographers use for high-definition artwork, and the printers that aim to service them. That's it. Not 99.9% of all printing in existence, and most definitely not newspaper and magazine print.

    There is no such industry standard; that's why I asked you to come up with a source for it. What standards body (ISO or similar) has stated that 300 dpi is an output standard? None has. Calling it one doesn't make it a standard.
    Andrew Rodney
    Author "Color Management for Photographers"
    http://www.digitaldog.net/
  • pathfinder Super Moderators Posts: 14,697 moderator
    edited May 14, 2012
    Andrew, is there an intrinsic reason to use a different sharpening technique for a print on metal (aluminum) versus a print on paper, say a premium lustre paper?

    Or does this depend on whether it is a high frequency image or a low frequency image?
    Pathfinder - www.pathfinder.smugmug.com

    Moderator of the Technique Forum and Finishing School on Dgrin
  • arodney Registered Users Posts: 2,005 Major grins
    edited May 14, 2012
    pathfinder wrote: »
    Andrew, is there an intrinsic reason to use a different sharpening technique for a print on metal (aluminum) versus a print on paper, say a premium lustre paper?

    I would think so. We'd use different sharpening for matte vs. glossy, so I suspect this metal substrate would necessitate a different output sharpening as well.
    Andrew Rodney
    Author "Color Management for Photographers"
    http://www.digitaldog.net/
  • Adamideas Registered Users Posts: 30 Big grins
    edited May 16, 2012
    Cool, thanks for the replies, guys. I've been crazy busy but will respond when things calm down next week.