It supports only a subset of the .OBJ file format, but even the shader editor itself does not support complex models. OBJ models are interpreted as a flat set of faces, so face groups are ignored. As a consequence, material files cannot be loaded: the loader has no way to decide which of the several materials in a .MTL file to use. This affects colors and (even worse) textures. Although the .OBJ format supports texture coordinates, those coordinates point into a texture map specific to the bodygroup to which a vertex belongs, which leads to strange results when a texture is applied to such a model in the shader editor.
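A minimal sketch of this line-based interpretation might look like the following. The names are purely illustrative (not the editor's actual code), polygon faces would still need to be triangulated, and everything except vertex and face statements is simply skipped, which is why grouping and material statements have no effect:

    #include <fstream>
    #include <sstream>
    #include <string>
    #include <vector>

    struct Vec3 { float x, y, z; };

    // Reads only "v" and "f" lines; "g", "usemtl", "mtllib", ... are ignored,
    // so per-group materials from a .MTL file can never be resolved.
    void loadObjSubset(const std::string& path,
                       std::vector<Vec3>& positions,
                       std::vector<unsigned>& faceIndices)
    {
        std::ifstream file(path);
        std::string line;
        while (std::getline(file, line)) {
            std::istringstream in(line);
            std::string tag;
            in >> tag;
            if (tag == "v") {                      // vertex position
                Vec3 p; in >> p.x >> p.y >> p.z;
                positions.push_back(p);
            } else if (tag == "f") {               // face: keep only the position index
                std::string corner;                // e.g. "3/1/2" -> position index 3
                while (in >> corner)
                    faceIndices.push_back(
                        std::stoul(corner.substr(0, corner.find('/'))) - 1);
            }
            // all other statements are skipped
        }
    }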
If your model does not include texture coordinates, they are generated automatically by sphere mapping: each vertex position is projected onto the unit sphere and then mapped into a texture that is 'wrapped' around that sphere. The results are of poor quality, but still better than having no texture coords at all...
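A sketch of how such a sphere-mapped coordinate could be computed, assuming the positions have already been centered around the origin (function and type names are made up for illustration):

    #include <cmath>

    struct Vec3 { float x, y, z; };
    struct Vec2 { float u, v; };

    // Projects the vertex position onto the unit sphere and converts the
    // resulting direction into spherical angles, which become (u, v).
    Vec2 sphereMapTexCoord(const Vec3& p)
    {
        float len = std::sqrt(p.x * p.x + p.y * p.y + p.z * p.z);
        if (len == 0.0f)                       // degenerate vertex at the origin
            return { 0.0f, 0.0f };
        Vec3 d = { p.x / len, p.y / len, p.z / len };   // point on the unit sphere

        const float pi = 3.14159265358979f;
        float u = 0.5f + std::atan2(d.z, d.x) / (2.0f * pi);  // longitude -> [0,1]
        float v = 0.5f - std::asin(d.y) / pi;                  // latitude  -> [0,1]
        return { u, v };
    }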
The same goes for normals. When a model lacks vertex normals, they are generated by summing the surface normals of all triangles incident to a vertex and then normalizing the result. This yields normals with a 'smooth' lighting effect.
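As a rough sketch of that accumulation scheme (the names and data layout are assumptions, not the editor's actual code):

    #include <cmath>
    #include <vector>

    struct Vec3 { float x, y, z; };

    static Vec3 sub(const Vec3& a, const Vec3& b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
    static Vec3 cross(const Vec3& a, const Vec3& b) {
        return { a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x };
    }

    std::vector<Vec3> computeSmoothNormals(const std::vector<Vec3>& positions,
                                           const std::vector<unsigned>& indices)
    {
        std::vector<Vec3> normals(positions.size(), { 0.0f, 0.0f, 0.0f });

        // Add each triangle's (unnormalized) surface normal to its three vertices.
        for (size_t i = 0; i + 2 < indices.size(); i += 3) {
            unsigned a = indices[i], b = indices[i + 1], c = indices[i + 2];
            Vec3 n = cross(sub(positions[b], positions[a]), sub(positions[c], positions[a]));
            for (unsigned idx : { a, b, c }) {
                normals[idx].x += n.x; normals[idx].y += n.y; normals[idx].z += n.z;
            }
        }

        // Normalize the accumulated sums -> smooth per-vertex normals.
        for (Vec3& n : normals) {
            float len = std::sqrt(n.x * n.x + n.y * n.y + n.z * n.z);
            if (len > 0.0f) { n.x /= len; n.y /= len; n.z /= len; }
        }
        return normals;
    }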
Because different models have different bounding volumes, the model's vertex coordinates are scaled and offset to fit into the cube from (-1,-1,-1) to (+1,+1,+1), with the center of the model's bounding box placed at the origin. This ensures that the model fits into the view.
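One way this fitting step could be implemented is sketched below. A uniform scale factor (largest half-extent of the bounding box) is an assumption on my part; the names are again illustrative:

    #include <algorithm>
    #include <vector>

    struct Vec3 { float x, y, z; };

    // Scales and offsets the positions into the [-1,+1] cube, centered at the origin.
    void fitToUnitCube(std::vector<Vec3>& positions)
    {
        if (positions.empty()) return;

        Vec3 lo = positions[0], hi = positions[0];
        for (const Vec3& p : positions) {
            lo.x = std::min(lo.x, p.x); hi.x = std::max(hi.x, p.x);
            lo.y = std::min(lo.y, p.y); hi.y = std::max(hi.y, p.y);
            lo.z = std::min(lo.z, p.z); hi.z = std::max(hi.z, p.z);
        }

        // Center of the bounding box and its largest half-extent.
        Vec3 center = { (lo.x + hi.x) * 0.5f, (lo.y + hi.y) * 0.5f, (lo.z + hi.z) * 0.5f };
        float half = std::max({ hi.x - lo.x, hi.y - lo.y, hi.z - lo.z }) * 0.5f;
        float scale = (half > 0.0f) ? 1.0f / half : 1.0f;

        // Uniform scaling keeps the proportions while fitting everything into [-1,+1].
        for (Vec3& p : positions) {
            p.x = (p.x - center.x) * scale;
            p.y = (p.y - center.y) * scale;
            p.z = (p.z - center.z) * scale;
        }
    }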
Vertex colors are created from the transformed vertex positions: the absolute value of each coordinate of the transformed position is taken and the result is subtracted from 1.0. This yields a vector in an inverted RGB cube.
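A small sketch of that color mapping (again with made-up names). Because the positions already lie in [-1,+1] after the fitting step, each absolute coordinate is in [0,1] and 1 - |coordinate| is a valid color channel:

    #include <cmath>

    struct Vec3 { float x, y, z; };

    // Derives an RGB color from the fitted vertex position: 1 - |coordinate|.
    Vec3 colorFromPosition(const Vec3& p)
    {
        return { 1.0f - std::fabs(p.x),
                 1.0f - std::fabs(p.y),
                 1.0f - std::fabs(p.z) };
    }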