mjb3030 Posted April 10, 2005

I am trying to retrieve a web page using TcpClient. It works, but if the page is very long, it always gets cut off. For example, I can receive the full page for http://www.google.com, but if I retrieve a Google search results page, the server disconnects halfway through and I only get half the page. Here is my code:

```vb
client = New TcpClient
client.Connect(txtURL.Text, 80)
stream = client.GetStream
Dim send() As Byte = Encoding.ASCII.GetBytes("GET / HTTP/1.0" & vbCrLf & _
                                             vbCrLf)
stream.Write(send, 0, send.Length)
Dim recv(client.ReceiveBufferSize) As Byte
While Not stream.DataAvailable
    Application.DoEvents()
End While
While stream.DataAvailable
    stream.Read(recv, 0, client.ReceiveBufferSize)
    Dim s As String = Encoding.ASCII.GetString(recv)
    TextBox1.Text = TextBox1.Text & s
End While
```

What should I be doing differently? (Besides eliminating the infinite DoEvents loop)
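The truncation above is most likely caused by the loop condition: `DataAvailable` only reports whether bytes are buffered *right now*, so the loop exits as soon as the network is briefly quieter than the reader, even though the server has more to send. The code also ignores `Read`'s return value and decodes the whole buffer each pass, so stale bytes from earlier reads get appended. A minimal C# sketch of the usual fix, looping until `Read` returns 0 (end of stream) and decoding only the bytes actually received (the host name and buffer size here are just example values):

```csharp
using System;
using System.Net.Sockets;
using System.Text;

class Program
{
    static void Main()
    {
        using (TcpClient client = new TcpClient("www.google.com", 80))
        using (NetworkStream stream = client.GetStream())
        {
            byte[] request = Encoding.ASCII.GetBytes(
                "GET / HTTP/1.0\r\nHost: www.google.com\r\n\r\n");
            stream.Write(request, 0, request.Length);

            StringBuilder page = new StringBuilder();
            byte[] buffer = new byte[4096];
            int read;
            // Read blocks until data arrives and returns 0 only when the
            // server closes the connection, so this loop sees the whole
            // response rather than just the chunks buffered so far.
            while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
            {
                page.Append(Encoding.ASCII.GetString(buffer, 0, read));
            }
            Console.WriteLine(page.ToString());
        }
    }
}
```

With HTTP/1.0 and no `Connection: keep-alive` header, the server closes the socket after the response, which is what makes the `Read == 0` termination reliable here.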
Mister E Posted April 10, 2005

Make things easier by using the WebClient class for this. I typically do something like this (in C#):

```csharp
WebClient wc = new WebClient();
byte[] buffer = wc.DownloadData( @"http://www.cnn.com" );
return System.Text.ASCIIEncoding.ASCII.GetString( buffer );
```
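As a variation on the suggestion above: if your framework version has it (it was added in .NET 2.0), `WebClient.DownloadString` fetches and decodes in one call, which also avoids mangling pages that contain non-ASCII characters. A short sketch (the URL is just an example):

```csharp
using System;
using System.Net;

class Program
{
    static void Main()
    {
        // WebClient handles the HTTP request, response framing, and
        // decoding; DownloadString returns the page body as a string.
        using (WebClient wc = new WebClient())
        {
            string html = wc.DownloadString("http://www.cnn.com");
            Console.WriteLine(html.Length);
        }
    }
}
```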