What Are Digital Images?
Digital Images are a Mosaic of Square Pixels
We all use digital images every day, but if we are honest, few of us know what they are at their core. A digital image is made up of pixels, arranged much like a mosaic of tiny square tiles. Zoom in far enough and, most of the time, you cannot even tell what you are looking at. We think this matters because "quality" is being redefined. If you think about what photography really is, you will see that most people have it backwards: photography is equally, if not more, about what you do in camera as what you do afterward.
The Hardware Baseline
This could get very technical, but we want to keep it practical. Pixels are captured by a sensor, and sensors vary in how many pixels they record. For wedding photography this is something of a double-edged sword: the more pixels you have, the sharper images will be for their intended use, but the sharper the image, the more visible any imperfections become. The other thing to say about pixels is that there is a marketing game where software divides each pixel into quarters. That is how a 12MP camera becomes "48MP" without any hardware change.
What we have noticed is that hardware image resolution in the photography industry has plateaued somewhat over the years. We think this is mostly the impracticality of going beyond a certain point. We do know that moving from the D800/D810 to the D850 forced us to upgrade our processing power and storage to handle the files. Even now we wonder whether people really need images as big as the ones we deliver.
What is the Difference?
Below is a comparison of the cameras we use and the common personal phones couples carry at the time of writing. We share these numbers simply to illustrate the differences between devices.
- D850: 8256 x 5504 pixels | 45.4 million pixels
- D800/810: 7360 x 4912 pixels | 36.2 million pixels
- iPhone 13 Pro: 4032 x 3024 pixels | 12.2 million pixels
- iPhone 14 Pro: 8064 x 6048 pixels | 48.8 million pixels
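If you want to check these numbers yourself, the math is simple: megapixels are just width times height. A small sketch (dimensions taken from the list above; note that spec sheets often quote "total" sensor pixels, which run slightly higher than the recorded image size):

```python
# Megapixel count is just width x height, expressed in millions.
sensors = {
    "D850": (8256, 5504),
    "D800/810": (7360, 4912),
    "iPhone 13 Pro": (4032, 3024),
    "iPhone 14 Pro": (8064, 6048),
}

for name, (w, h) in sensors.items():
    mp = w * h / 1_000_000  # millions of pixels
    print(f"{name}: {mp:.1f} MP")
```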
Notice how doubling the pixel dimensions from the iPhone 13 Pro to the 14 Pro quadruples the pixel count? This is said to be due to "a new machine learning model designed specifically for the quad-pixel sensor, iPhone now shoots ProRAW at 48MP with an unprecedented level of detail, enabling new creative workflows for pro users." In other words, software invents detail that the hardware never captured.
The takeaway is that pixel counts become a point of confusion once software manipulation and marketing get involved. For example, you can get a larger wide-angle image by digitally stitching frames together, but the baseline hardware is still a 12MP sensor. Even the Google Pixel 6 Pro at 50MP seems to employ what is effectively a 12MP output from a quad-Bayer sensor. It is a technical read, but what we took from it is that breaking each pixel down into quarters yields only marginal resolution gains.
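To see why splitting pixels into quarters does not add detail, here is a toy sketch: "quadrupling" a tiny 2x2 image by repeating each pixel (nearest-neighbour upscaling, a deliberately simplified stand-in for what these cameras do). The pixel count goes from 4 to 16, but the image still contains only the original four values:

```python
# A tiny 2x2 "image" of brightness values.
small = [
    [10, 20],
    [30, 40],
]

# Split each pixel into four: repeat each value twice across,
# then repeat each row twice down. 4 pixels become 16.
big = []
for row in small:
    doubled = [v for v in row for _ in range(2)]  # repeat each pixel twice
    big.append(doubled)
    big.append(list(doubled))  # repeat the whole row twice

print(big)
# More pixels, but the set of distinct values is unchanged:
print(sorted({v for r in big for v in r}))
```

No new information was created; the file just got bigger. Real quad-pixel processing is far more sophisticated, but the hardware limit is the same.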
Implications of Pixels
We are going to define pixels as "real" hardware pixels. Software can do some amazing things, and when it runs the instant a photo is taken, you likely do not even notice it happening. Here is the takeaway: if a phone can use software to fake more than four times its hardware's resolution, how much better is a camera whose hardware delivers that resolution without the software tricks?
On Sharpness
How sharp something looks depends on the size it is viewed at, how it was captured, and how well trained your eye is. Most people view images quite small, on phone or computer screens. Images online, on a website like this one, contain roughly 1% of the pixels in the full-resolution file. All that to say, our joke about being able to print a billboard is fairly accurate: there are more pixels there than most people will ever want or need. Does this mean we should be sloppy, or scale things down? Some might, but our take is to deliver the highest quality.
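That "roughly 1%" figure is easy to sanity-check. Here is a quick sketch using a hypothetical blog-sized image next to a D850 frame (the 800 x 533 web size is an illustrative example, not our actual export setting):

```python
# Full-resolution D850 frame vs. a typical blog-sized web image.
full_w, full_h = 8256, 5504   # D850 full-resolution dimensions
web_w, web_h = 800, 533       # hypothetical web-display size

fraction = (web_w * web_h) / (full_w * full_h)
print(f"{fraction:.1%} of the original pixels")  # about 1%
```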
The sensor is one aspect of sharpness, but then there is the lens that captures the light. Glass varies in how cleanly it passes light through to the sensor, and that is quality a personal phone cannot make up with software. It is part of why you see multiple "lenses" on the back of a phone. Lenses are art with intentional design: you can put a junk (kit) lens on an amazing camera sensor and lose a great deal of quality.
On Editing
Better in, better out is how we would summarize it. There are different levels of editing (see How we edit images). Most people think of editing at the creation/removal level. What most do not realize is that this kind of creation and photo manipulation destroys pixel quality. Given everything above, having the best hardware (sensor, lenses) and shooting well with it means you can do more in post without visible impact. "Can do," "should do," and "need to do" are all different things, so we certainly recommend reading our other blog on editing.
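One simple, concrete form of pixel destruction is cropping. A quick sketch with illustrative numbers: crop a D850 frame to half its width and half its height, and only a quarter of the pixels survive. Starting with more real pixels is what gives you that room to work:

```python
# Cropping a D850 frame to half its width and half its height.
w, h = 8256, 5504
crop_w, crop_h = w // 2, h // 2

before = w * h
after = crop_w * crop_h
print(f"{before / 1e6:.1f} MP -> {after / 1e6:.1f} MP")  # 45.4 MP -> 11.4 MP
```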
What was the Mosaic?
Have you figured out what was in the photo above? It was an eyelash and some trees. There was no photo manipulation or software creation involved.