I'd like to begin by apologizing if this topic is not of interest to a lot of people; however, when I began my journey with .NET, there was very little guidance to help me start programming with Visual Studio. Currently, I do not program any user interfaces. I focus on creating APIs for Braille translation that sighted developers can use to build apps (mobile or desktop) that add Braille output capabilities. My goal is to make the process easy by abstracting the complexities of Braille, so that developers don't have to worry about such things. They would only call an API with two lines of code (including the using statement in C#, or the Imports statement in VB). So, how do I plan to accomplish such a goal?
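For illustration, the calling pattern might look roughly like this. The BrailleTranslation namespace and the BrailleTranslator.Translate method below are placeholder names used only as a sketch, not the actual API:

```csharp
// Hypothetical two-line usage: one using directive, one call.
// The namespace and method names are placeholders for illustration only.
using BrailleTranslation;

string braille = BrailleTranslator.Translate("Hello, world!");
Console.WriteLine(braille);
```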
I will now describe my set of tools. The most important tool I use (besides Visual Studio) is a screen reader called NVDA. It is the best screen reader for Visual Studio interaction. It lets me know about the items that appear when I use the editor's autocomplete feature. NVDA also tells me about the menu items, tree views, and lists within the IDE. This is all done via a text-to-speech engine that reads the required information from the active program as it appears.
My next tool is the GitHub for Windows shell. This also works very well with NVDA, and there's not much to say about it, except that it is fun to "commit, push and pull!"
When I want to test the output of a particular class or method, I write a simple Windows Forms app with two text boxes and a couple of labels, along with a couple of buttons. In one text box, I type the input, and I implement an event handler that sends the output to the second text box so I can analyze it.
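Here is a rough sketch of that kind of test harness, assuming a hypothetical BrailleTranslator.Translate method standing in for the class under test:

```csharp
using System;
using System.Windows.Forms;

// Stand-in for the real translation API so the sketch compiles;
// the actual library and method names are not specified here.
internal static class BrailleTranslator
{
    public static string Translate(string text) => text; // placeholder
}

public class TranslationTestForm : Form
{
    private readonly Label inputLabel = new Label { Text = "Input text:", Top = 10, Left = 10 };
    private readonly TextBox inputBox = new TextBox { Top = 30, Left = 10, Width = 300 };
    private readonly Button translateButton = new Button { Text = "Translate", Top = 60, Left = 10 };
    private readonly Label outputLabel = new Label { Text = "Braille output:", Top = 95, Left = 10 };
    private readonly TextBox outputBox = new TextBox { Top = 115, Left = 10, Width = 300, ReadOnly = true };

    public TranslationTestForm()
    {
        Text = "Braille translation test";
        translateButton.Click += OnTranslateClicked;
        Controls.AddRange(new Control[] { inputLabel, inputBox, translateButton, outputLabel, outputBox });
    }

    // Event handler: run the input through the class under test
    // and show the result in the second text box for inspection.
    private void OnTranslateClicked(object sender, EventArgs e)
    {
        outputBox.Text = BrailleTranslator.Translate(inputBox.Text);
    }

    [STAThread]
    public static void Main()
    {
        Application.EnableVisualStyles();
        Application.Run(new TranslationTestForm());
    }
}
```

Because the controls are standard Windows Forms text boxes, labels, and buttons, NVDA reads them without any extra work, which is what makes this such a convenient way to inspect output.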
I'd like to conclude by saying that software development tools can sometimes be unfriendly to screen readers, which makes them unusable by blind developers. This is why .NET is one of the best choices for people who are visually impaired and want to develop software.