As photography has gained recognition over the years, universities have emerged that train people to capture photographs. Upon course completion, they award degrees just as any other college would. In the USA, a number of universities are known for their photography degrees; for instance, a small Baptist college in North Carolina offers one.
Many other universities run entire programs to educate young photographers, offering diplomas and degrees in the field. Part of the reason so many photography programs have opened is that photography has become a booming business.
The United States certainly has its fair share of universities, including some of the best and most decorated in the world, and a great many of them award degrees in photography. A large share of schools in the States have liberal arts backgrounds, which means they spread their efforts and offerings across a wide range of disciplines.
Photography is a vital discipline that makes the world go around. The small Baptist college I attended in North Carolina awarded a degree in photography. All of the major websites and print publications in the United States keep staff photographers on hand, and you can bet that most of those based in the USA earned a photography degree from an American university.
Photography programs are vital to these universities. Photography has become a business that has done nothing but grow, and there will always be a need for photographers. So, in closing, the answer to this question is without a doubt yes: universities in the United States give out degrees in photography.