To programmers, the benefits of source control are obvious: revision history, revert and recovery, teamwork, integration with build and continuous integration tools, and so on. No one questions the use of source control any more. It all boils down to a simple rule: if it is not in source control, it does not exist. Build and continuous integration tools expect code to be in source control. If it is not there, it will not be built. If it is not built, it will not be deployed. If it is not deployed, it will not be tested. If it is not tested, it will not be put into production.
If developers understand the importance of source control, why isn't it used by others working alongside them? While many infrastructure, database, operations and other non-programming experts do use source control, it is still not the rule that everything must be committed before it is used.
It is fairly common practice that an infrastructure expert sets up a server according to the required specification. The knowledge of what was done stays with that expert. Since nothing was committed to source control, the same process cannot be repeated and changes cannot be tracked. In other words, what was done exists only in the head of the person who did it. Ideally, server setup should be part of the continuous integration process. Files are committed to source control, the build is executed by a CI server (Jenkins, Hudson, Bamboo...) and the server is ready, tested and, finally, deployed to production. If this procedure applies to development, there is no reason why it should not apply to infrastructure.
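As a minimal sketch of what "server setup living in source control" can mean, consider an idempotent setup script that a CI server could run on every build. All names here (the script, the config path, the setting) are illustrative assumptions, not a prescribed layout:

```shell
#!/bin/sh
# provision.sh -- hypothetical setup step kept in source control and run by CI.
# It is idempotent: running it twice leaves the server in the same state.
set -e

CONF=/tmp/demo-app.conf
desired="listen_port=8080"

# Only touch the config file if it differs from the desired state.
if [ ! -f "$CONF" ] || [ "$(cat "$CONF")" != "$desired" ]; then
    printf '%s\n' "$desired" > "$CONF"
    echo "config updated"
else
    echo "config already up to date"
fi
```

Because the script is committed, every change to it is tracked, and the CI server can repeat the exact same setup on any machine.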
There can hardly be any excuse not to follow development practices. For infrastructure tasks, the tools are already there: a set of simple scripts, configuration written in Puppet or Chef, or any other solution. Source control gives us documented, tracked and repeatable processes. Which tools are used is of secondary importance.
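For example, the same "install and run a web server" task expressed as a Puppet manifest becomes a committable, reviewable file rather than knowledge in someone's head. This is only a sketch; the choice of nginx and a Debian-family node are assumptions:

```puppet
# nginx.pp -- hypothetical manifest kept in source control.
# Declares the desired state; Puppet makes the server converge to it.
package { 'nginx':
  ensure => installed,
}

service { 'nginx':
  ensure  => running,
  enable  => true,
  require => Package['nginx'],
}
```

The declarative style is the point: the manifest states what the server should look like, so rerunning it is safe, and the diff history in source control doubles as documentation of every change.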
What really matters is that "if it is not in source control, it does not exist".