While our smartphone photo tests mostly cover flagships, the Pixel 4a deserves our attention, because this phone has a weapon of choice: it promises "Google Pixel" image quality at a moderate price (349 euros). Facing the five camera modules of a high-end phone such as the Huawei P40 Pro Plus, the Pixel 4a shows obvious hardware restraint with its single wide-angle module: a 27 mm equivalent lens opening at f/1.7, optically stabilized and paired with a classic 12 Mpix sensor in 1/2.55-inch format.
Nothing new, nothing exceptional, but a simple module with proven photographic qualities. So the question was not to compare the Pixel 4a with devices three times its price, but to determine whether it is the ideal "popular compact".
Whether for reasons of cost, aesthetics, product-range consistency or something else, the camera unit of the Pixel 4a adopts the visual code of its big brothers: a protrusion in the form of a glass square with rounded edges. From a purely functional point of view, it is a safe bet that it was unnecessary (the extra millimeter of thickness does nothing for a wide angle), but at least the "Pixel" identity is recognizable. Up close, you can see the lens itself, along with a single LED serving as flash and torch. Equipment worthy of an iPhone 6! Our biggest regret is obviously the lack of an ultra-wide angle, so practical indoors or for landscapes.
This hardware simplicity is reflected in the Camera app, which takes more inspiration from Apple's app than from Huawei's or Samsung's. The Pixel 4a is minimalist and should be seen as a camera with no pretension other than producing beautiful images quickly, which still covers the needs of the majority of users.
Image accuracy: the power of visual computing
When it comes to photos, Google's teams believe that computation is the main lever for improving image quality, ahead of the nature (and quality) of the optics and sensors. If this point of view has its limits (for digital zoom in particular), it also has its strengths.
Starting from ordinary hardware (a single camera module with a classic 12 Mpix sensor and standard wide-angle optics), Google's software engineers therefore put all their energy and know-how into software processing.
And it pays off: in a number of situations, image sharpness is excellent, largely on par with the latest iPhone 11 Pro. In the majority of scenes, the camera renders textures very well and is perfectly capable of artificially recreating a separation between subject and background.
All this comes at the cost of some image processing time: once a shot is taken, a small blue icon is displayed in the preview of the photo app's gallery to signal that the algorithms are still "cooking" the final image (6 s on average). Nothing annoying, since you can already preview the photo and the camera remains usable while the processing runs in the background.
The limits of software processing
Examining the pictures with a magnifying glass, we quickly detect areas where the image processing is too aggressive: here an HDR pass that has generated digital noise in the shadows, there foliage rendered a little too soft next to metal parts that pop.
Or a car in a night landscape which, magnified to 100%, looks like a block of pixels. Even with nearly static subjects such as stars, images examined at more than 100% on a screen reveal processing flaws.
Another weak point of the algorithms is the simulated background blur. And this especially on subjects known for not helping photographers, namely children and animals. Cats included, as the French comedy troupe Les Nuls perfectly demonstrated: "Cats are really wankers" (in French in the original). So our friend Mirage, pictured here, gave the Pixel 4a's algorithms no quarter, tripping them up several times.
In general, and the bokeh effect aside, digital "flaws" are only visible at 100% on a 27-inch screen, and only when nitpicking. In normal viewing conditions, that is to say on the smartphone's screen, in a 10×15 cm print, or even as a wallpaper, it is very difficult for the average person to detect anything.
Excellent colors, somewhat heavy smoothing in low light
The other victory stemming from Google's software-first priority: color quality. Pixels have led the way in color management for three generations of devices.
Difficult light, such as tree foliage, is no problem for the Pixel 4a, which faithfully renders natural colors: no bluish casts or artificial warming of hues. Even indoors under mixed color temperatures (incandescent plus halogen, for example), the Pixel 4a finds a sensible middle-ground white balance. Across our various test images, we were never able to fault the color rendering engine, which is excellent.
At high sensitivities, on the other hand, very heavy smoothing eliminates most of the details. While the goal is obviously to reduce digital noise, the price paid here is the disappearance of the depth effect and the flattening of textures and relief. Here Google pays for its single, small sensor.
Video: effective stabilization at the cost of a lot of cropping
Revolutionary when the first Pixel was released, the video stabilization of the Pixel 4a is still effective and lets you shoot watchable footage without resorting to an external stabilization system. But the "old" algorithms demand their due (Google is clearly no longer at the forefront in this area): the cropping is still just as aggressive. Understand that you lose a lot of angular coverage, and the initial wide angle, equivalent to 27 mm, ends up closer to 45 mm.
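The crop can be quantified with the usual full-frame equivalence (a 36 mm-wide reference frame): the horizontal angular coverage drops from roughly 67° at 27 mm to roughly 44° at 45 mm. A minimal Python sketch, using the review's 27 mm and 45 mm figures as inputs (the 45 mm value is the article's estimate, not a measured spec):

```python
import math

def horizontal_fov_deg(equiv_focal_mm, frame_width_mm=36.0):
    """Horizontal field of view for a full-frame-equivalent focal length,
    using the standard thin-lens approximation."""
    return math.degrees(2 * math.atan(frame_width_mm / (2 * equiv_focal_mm)))

wide = horizontal_fov_deg(27)     # native 27 mm equivalent
cropped = horizontal_fov_deg(45)  # rough post-stabilization equivalent

# Linear share of the original scene width that survives the crop
kept = math.tan(math.radians(cropped / 2)) / math.tan(math.radians(wide / 2))

print(f"27 mm equiv: {wide:.1f} deg, 45 mm equiv: {cropped:.1f} deg")
print(f"about {kept:.0%} of the frame width remains after stabilization")
```

In other words, only about 60% of the original frame width (so roughly a third of the image area) survives, which is why the loss feels so drastic in practice.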
Autofocus: the DNA of an iPhone 4S
On the autofocus side, the Pixel 4a is reminiscent of the iPhone 4S, the first smartphone to adopt the philosophy "better a blurry image than no image at all". It must be said that at the time (2011), smartphone contrast-detection autofocus was so sluggish that many devices fired a thousand years after the battle, if at all.
The flip side is the low-light behavior: ready or not, focused or not, the Pixel's Camera app will fire. This behavior encourages users to pay less attention to the stability of their framing ("if the device fires, it must be ready"), and condemns the shakiest (or least patient) among them to blurry shots as soon as the light dims.
Fortunately, savvy users will quickly understand that the "optimistic" behavior of the Pixel 4a must be offset by a "traditionalist" approach: check that the image on the screen is sharp and that your hand has been steady for at least a second or two before firing. Under these conditions, the Pixel 4a rarely misses its mark.
With the Pixel 4a, Google delivers what is arguably the best image-quality-to-price ratio in the world of photophones, its camera module being roughly similar to that of its big brothers. Its undeniable assets are obviously the quality of exposure and the excellence of the colors, which are benchmarks in the world of smartphone photography.
But the Pixel 4a's photographic scope remains limited: with a single camera module, it offers neither the creative palette of high-end phones (different focal lengths, sophisticated background blur, etc.) nor the refinements that large sensors provide (high resolutions, low-light performance). The Google Pixel 4a should be seen as the perfect compact for everyday photography. And at 349 euros, that is already quite a feat.