sim085
Socket Programming (Problem)
May 10 2005 6:33 AM
Hello, I have built a class that uses sockets to both connect and receive data. I wanted my classes to accept connections only from a known application, so I added another class ("ValidationClass") whose only job is to open a connection to the remote location and send a string to be validated. The other side then checks the data it receives and accepts or rejects the connection.

However, I have a problem. When I run everything normally, my server application does not tell me whether the connection was accepted or not. But when I DEBUG my client-side application, the server side reports the correct result (whether it was accepted or not). The only reason I can think of why it works under DEBUG is that stepping through with F10 takes time, so the connection gets made correctly!! I do not know how to make sure that my server side is initialized from the client side :( or vice versa.

Does anyone have any ideas? Thanks in advance for any help you may provide.
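For reference, here is a minimal sketch of the kind of validation handshake described above, assuming TcpListener/TcpClient are used (the original code is not shown, and the class name, port, loopback address, and token below are all hypothetical). The idea is that the server blocks on Read until the validation string arrives, and the client waits for the server's reply before closing, so neither side depends on the delays that stepping through with F10 happens to introduce.

    using System;
    using System.Net;
    using System.Net.Sockets;
    using System.Text;

    // Hypothetical demo: run with argument "server" in one console, then run
    // it again with no argument in another console to act as the client.
    class ValidationDemo
    {
        const string Token = "MY_SECRET_TOKEN";   // hypothetical shared secret
        const int Port = 9000;                    // hypothetical port

        static void Main(string[] args)
        {
            if (args.Length > 0 && args[0] == "server") RunServer();
            else RunClient();
        }

        static void RunServer()
        {
            TcpListener listener = new TcpListener(IPAddress.Any, Port);
            listener.Start();

            // AcceptTcpClient() blocks until the client connects.
            using (TcpClient client = listener.AcceptTcpClient())
            using (NetworkStream stream = client.GetStream())
            {
                // Read() blocks until data arrives, so no debugger delay is needed.
                byte[] buffer = new byte[256];
                int read = stream.Read(buffer, 0, buffer.Length);
                string received = Encoding.ASCII.GetString(buffer, 0, read);

                bool accepted = (received == Token);
                Console.WriteLine(accepted ? "Connection accepted" : "Connection rejected");

                // Reply before closing so the client knows the verdict.
                byte[] reply = Encoding.ASCII.GetBytes(accepted ? "OK" : "NO");
                stream.Write(reply, 0, reply.Length);
            }
            listener.Stop();
        }

        static void RunClient()
        {
            // "127.0.0.1" stands in for the remote location in this sketch.
            using (TcpClient client = new TcpClient("127.0.0.1", Port))
            using (NetworkStream stream = client.GetStream())
            {
                byte[] token = Encoding.ASCII.GetBytes(Token);
                stream.Write(token, 0, token.Length);

                // Wait for the server's answer instead of closing right away;
                // closing too early is exactly the race that debugging can hide.
                byte[] buffer = new byte[16];
                int read = stream.Read(buffer, 0, buffer.Length);
                Console.WriteLine("Server said: " + Encoding.ASCII.GetString(buffer, 0, read));
            }
        }
    }

If the token still does not arrive in a single Read call, the usual approach with TCP is to loop until the expected number of bytes has been received, or to prefix the message with its length.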
Answers (2)