Given that he already has a robot that can paint, wouldn't it be easy (well, doable at least :) to really paint the picture instead of printing it? There is no reason this kind of painting should be any different from one produced manually.
There is a missing piece of automation here still.
The robot can replicate recorded movements - brush angle, pressure, timing, etc.
A Photoshop filter can shift pixels around to make a photo look like an oil painting.
What nobody has come up with yet (and what would likely be hard, or at least costly, to develop) is a way to translate a photograph, or the output of a Photoshop filter, into the set of movements required to render it in physical oil paint.
Until we get 3D printers with atom-level precision, you can't really reproduce a brush stroke with a vertical nozzle, so you'd need to somehow identify those strokes from the depth scan.
Software already exists to decompose a painting or drawing into a series of strokes (drawings are obviously easier). The examples I've seen were all 2D, but I am given to understand that 3D equivalents exist as well.
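To make the 2D case concrete, here is a toy sketch of one well-known approach (in the spirit of Haeberli-style stroke-based rendering, not any particular product's algorithm): sample the image on a grid, orient a short stroke along the local edge direction, and color it from the pixel underneath. All names and parameters below are illustrative, not taken from real software.

```python
import numpy as np

def image_to_strokes(img, spacing=4, length=6):
    """Decompose a grayscale image into a list of straight strokes.

    Each stroke is (x0, y0, x1, y1, gray): placed on a grid, oriented
    perpendicular to the local intensity gradient (i.e., along edges),
    and colored by the pixel it covers. A toy version of stroke-based
    rendering; real systems use curved, layered, variable-width strokes.
    """
    h, w = img.shape
    gy, gx = np.gradient(img.astype(float))
    strokes = []
    for y in range(0, h, spacing):
        for x in range(0, w, spacing):
            # Orient along the edge: rotate the gradient by 90 degrees.
            dx, dy = -gy[y, x], gx[y, x]
            norm = np.hypot(dx, dy)
            if norm < 1e-6:          # flat region: default orientation
                dx, dy = 1.0, 0.0
            else:
                dx, dy = dx / norm, dy / norm
            strokes.append((x - dx * length / 2, y - dy * length / 2,
                            x + dx * length / 2, y + dy * length / 2,
                            float(img[y, x])))
    return strokes

# Tiny example: a vertical edge produces vertical strokes along it.
img = np.zeros((8, 8))
img[:, 4:] = 255
strokes = image_to_strokes(img)
```

Note the output is purely geometric: a list of line segments with colors, which is exactly the "has no concept of a physical brush" limitation discussed below.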
Getting a robot to actually lay down each stroke with paint is left as an exercise for the reader. ;-)
All the software I've seen (and I'd be fascinated to see anything I've missed) does not, as far as I know, work in a way that would be useful for a robot painter.
Photoshop filters, and more advanced converters like DeepArt, produce something that looks like strokes but has no concept of strokes. Illustrator's convert-to-vector/stroke/outline uses a concept of stroke that doesn't account for the drag and physics of a brush, and may not even be physically possible to produce.
There are, of course, some conversions we can already achieve. We have pen-based plotters and could write simple contrast/cross-hatching converters, etc. But these are qualitatively different from scanning an image and converting it into strokes for a robot arm.
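For a sense of how simple that plotter end of the spectrum is, here is a minimal contrast-to-hatching converter of the kind mentioned above (a sketch under my own assumptions, not any existing tool): darker regions get more passes of straight pen segments, which a plotter can draw directly.

```python
def hatch_lines(img, levels=(192, 128, 64), spacing=4):
    """Convert a grayscale image (list of rows of 0-255 ints) into
    pen-plotter segments: (x0, y, x1, y) horizontal runs.

    Darker pixels pass more of the thresholds in `levels`, so they
    collect more hatch passes; each pass starts on a different row
    offset so the passes interleave.
    """
    segments = []
    h, w = len(img), len(img[0])
    for pass_no, thresh in enumerate(levels):
        for y in range(pass_no, h, spacing):
            x = 0
            while x < w:
                if img[y][x] < thresh:          # dark enough: start a run
                    x0 = x
                    while x < w and img[y][x] < thresh:
                        x += 1
                    segments.append((x0, y, x - 1, y))
                else:
                    x += 1
    return segments

# A dark square on a light background gets hatched; the background doesn't.
img = [[255] * 8 for _ in range(8)]
for y in range(2, 6):
    for x in range(2, 6):
        img[y][x] = 0
segs = hatch_lines(img)
```

The output is a handful of straight segments, all inside the dark square - a world away from the stroke physics a brush-wielding arm would need, which is the qualitative difference being pointed at.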
Additionally, regarding desktop graphics editors such as Photoshop and Illustrator: rest assured that Fractal Design Painter definitely had a model 20 years ago that incorporated (well, approximated, anyway) medium, angle, brush size, pressure, direction changes, the color you were painting over, etc. The latest version of Corel Painter has added features such as particle systems and modeling of individual bristles (and other features have been added in the interim).
The intent in Painter is to enable the creation of new works in a way that feels natural to artists used to traditional media (not to mention reproducing the appearance of those media), which isn't quite what we're talking about here, but hopefully this demonstrates that such models do actually exist.
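To illustrate the shape of such a model (this is my own drastically simplified sketch, not Painter's actual algorithm, and every parameter name is hypothetical): render a stroke as a series of circular dabs, where pressure modulates both dab size and opacity, and each dab blends over the paint already on the canvas.

```python
def stamp_dab(canvas, x, y, pressure, color, max_radius=3, opacity=0.8):
    """Blend one circular brush dab into a grayscale canvas (rows of floats).

    A stroke is rendered as a series of dabs along its path; pressure
    scales both the dab radius and its opacity, and each dab
    alpha-blends over the paint already present -- a toy stand-in for
    the medium/pressure/underlying-color models described above.
    """
    radius = max(1, round(max_radius * pressure))
    alpha = opacity * pressure
    h, w = len(canvas), len(canvas[0])
    for py in range(max(0, y - radius), min(h, y + radius + 1)):
        for px in range(max(0, x - radius), min(w, x + radius + 1)):
            if (px - x) ** 2 + (py - y) ** 2 <= radius ** 2:
                # Blend over the existing color, like paint over paint.
                canvas[py][px] = (1 - alpha) * canvas[py][px] + alpha * color

canvas = [[255.0] * 16 for _ in range(16)]
# One stroke: dabs along a line, pressure ramping up and back down.
for i, p in enumerate([0.2, 0.5, 1.0, 0.5, 0.2]):
    stamp_dab(canvas, 3 + 2 * i, 8, p, color=0.0)
```

After the loop the stroke is darkest and widest in the middle, tapering at both ends - the kind of natural-media feel the comment above is describing, even in this crude form.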