I have a Windows Forms application I created that does HTTP web scraping and stores specific information in a DataGrid.
I've noticed that users who browse the web via a proxy server complain that the application doesn't work for them.
How do I overcome this issue?
I'm reasonably new to .NET, but I have a good understanding of it and of OOP programming (and I'm very familiar with VBScript in classic ASP).
Could someone help me by showing me how to make my scraping work through a proxy?
My scraping function looks like this:
Imports System.Net
Imports System.IO

Public Class ForumParser
    ' Downloads the page at URL_ and returns its HTML, or "" on any failure.
    Public Function GetHTML(ByVal URL_ As String) As String
        Try
            Dim request As HttpWebRequest = CType(WebRequest.Create(URL_), HttpWebRequest)
            ' Dispose of the response and reader so the connection is released.
            Using response As HttpWebResponse = CType(request.GetResponse(), HttpWebResponse)
                Using reader As New StreamReader(response.GetResponseStream())
                    Return reader.ReadToEnd()
                End Using
            End Using
        Catch
            Return ""
        End Try
    End Function
End Class
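In case it helps to show the direction I've been looking at: a minimal sketch of routing a request through a proxy via the request's `Proxy` property. The proxy host, port, and credentials below are placeholders, not real values; in practice they would come from the user's settings, or you can fall back to the system-configured proxy.

```vbnet
Imports System.Net

Module ProxySketch
    Sub ConfigureProxy(ByVal request As HttpWebRequest)
        ' Option 1: reuse the machine's configured (IE/system) proxy.
        ' This is often enough for users behind a corporate proxy.
        request.Proxy = WebRequest.DefaultWebProxy
        request.Proxy.Credentials = CredentialCache.DefaultCredentials

        ' Option 2: point at an explicit proxy the user supplies.
        ' "proxy.example.com", 8080, and the credentials are placeholders.
        Dim explicitProxy As New WebProxy("proxy.example.com", 8080)
        explicitProxy.Credentials = New NetworkCredential("user", "password")
        request.Proxy = explicitProxy
    End Sub
End Module
```

Is something along these lines the right approach, and which of the two options would normally be preferred?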
I'd really appreciate any help anyone can give. Thanks in advance.
-WizzKidd