Xtreme .Net Talk


Posted

I am trying to retrieve a web page using TcpClient. It works, but if the page is very long, it always gets cut off. For example, I can receive the full page for http://www.google.com, but if I retrieve a Google search results page, the server disconnects halfway through and I only get half the page. Here is my code:

 

       client = New TcpClient
       client.Connect(txtURL.Text, 80)
       stream = client.GetStream
       Dim send() As Byte = Encoding.ASCII.GetBytes("GET / HTTP/1.0" & vbCrLf & _
           vbCrLf)
       stream.Write(send, 0, send.Length)
       Dim recv(client.ReceiveBufferSize) As Byte
       While Not stream.DataAvailable
           Application.DoEvents()
       End While
       While stream.DataAvailable
           stream.Read(recv, 0, client.ReceiveBufferSize)
           Dim s As String = Encoding.ASCII.GetString(recv)
           TextBox1.Text = TextBox1.Text & s
       End While

 

What should I be doing differently? (Besides eliminating the DoEvents busy-wait loop)

Posted

Make things easier by using the WebClient class for this.

 

I typically do something like this (in C#):

WebClient wc = new WebClient();
byte[] buffer = wc.DownloadData( @"http://www.cnn.com" );

return System.Text.Encoding.ASCII.GetString( buffer );
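
WebClient handles all of the buffering and reading for you, which is why it never truncates. If you do want to stay with raw TcpClient, the cut-off happens because DataAvailable only reports whether bytes are already sitting in the local receive buffer, not whether the server has finished sending, so the loop can exit while the rest of the page is still in transit. Here is a sketch of a read loop that instead keeps reading until the server closes the connection (Read returns 0), reusing the txtURL and TextBox1 controls from the original post:

        ' Assumes Imports System.Net.Sockets and Imports System.Text at the top of the file.
        Dim client As New TcpClient()
        client.Connect(txtURL.Text, 80)
        Dim stream As NetworkStream = client.GetStream()

        Dim send() As Byte = Encoding.ASCII.GetBytes("GET / HTTP/1.0" & vbCrLf & vbCrLf)
        stream.Write(send, 0, send.Length)

        ' Read blocks until data arrives or the server closes the connection,
        ' so no DoEvents polling is needed. It returns 0 only when the server
        ' closes the socket, which an HTTP/1.0 server does after sending the
        ' complete response.
        Dim recv(4095) As Byte
        Dim count As Integer = stream.Read(recv, 0, recv.Length)
        While count > 0
            ' Convert only the bytes actually read on this pass.
            TextBox1.Text = TextBox1.Text & Encoding.ASCII.GetString(recv, 0, count)
            count = stream.Read(recv, 0, recv.Length)
        End While
        client.Close()

That said, WebClient (or HttpWebRequest) is still the simpler choice, since it also deals with the headers, redirects and response encoding for you.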
