ASP.NET Proxy Page – Used for Cross Domain Requests from AJAX and JavaScript

One of the pain points of developing AJAX, JavaScript, jQuery, and other client-side behaviors is that the browser's same-origin policy does not allow cross-domain requests for pulling content. For example, JavaScript code on www.johnchapman.name cannot pull content or data from www.bing.com.  I have found this particularly painful when doing SharePoint development.

One way to overcome this issue is to use a server-side proxy on the site running the JavaScript code. There are already well-documented PHP solutions on the web, but I could not find many .NET-based solutions. This simple C# code takes the URL passed to it through a URL-encoded query string, retrieves the content of that URL using an HttpWebRequest, and outputs it in its original form as if it were content on the site.

The following code can be added to an ASP.NET application or to SharePoint.  For SharePoint, you can place the two files (Proxy.aspx and Proxy.aspx.cs) into the "_layouts" folder (c:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\12\TEMPLATE\LAYOUTS).

Please Note: To keep this example simple and to the point, it does not contain any error or exception handling.  You can modify this code and add error handling as you see fit.  Also, allowing all websites to be proxied through your site could allow malicious content to appear as if it were coming from your site.  The query string method below is not recommended for publicly accessible websites; instead, use an ID in the query string and look up the actual URL from a database.  This puts you in complete control of the sites allowed to be proxied (a sketch of this approach appears in the breakdown below).


 Complete Proxy.aspx.cs
using System;
using System.Collections.Generic;
using System.Web;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Net;
using System.IO;

namespace Proxy {
    public partial class _Proxy : System.Web.UI.Page {
        protected void Page_Load(object sender, EventArgs e) {
            // Read the URL-encoded target address from the "u" query string parameter.
            string proxyURL = string.Empty;
            try {
                proxyURL = HttpUtility.UrlDecode(Request.QueryString["u"].ToString());
            }
            catch { }

            if (!string.IsNullOrEmpty(proxyURL)) {
                // Request the remote page on the server side, where the browser's
                // same-origin policy does not apply.
                HttpWebRequest request = (HttpWebRequest)WebRequest.Create(proxyURL);
                request.Method = "GET";
                HttpWebResponse response = (HttpWebResponse)request.GetResponse();

                if (response.StatusCode == HttpStatusCode.OK) {
                    // Echo the remote content back with its original content type
                    // so it appears to come from this site.
                    string contentType = response.ContentType;
                    Stream content = response.GetResponseStream();
                    StreamReader contentReader = new StreamReader(content);
                    Response.ContentType = contentType;
                    Response.Write(contentReader.ReadToEnd());
                }
            }
        }
    }
}




Proxy.aspx.cs Breakdown

Make sure that at least the "using" references listed above have been added to the code-behind and that the code is placed in the Page_Load method of the code-behind.
Since we are using a query string to determine which page to retrieve for the proxy in this example, we need to get the value of the query string and, if no value is found, not attempt to retrieve anything.  If the proxy will not be reused for other pages, you could hard-code a URL instead.  You could also store the URLs in a database and pass an ID in the query string rather than the URL itself (see the sketch after this breakdown).  It is important to remember that query strings only support certain characters, so if the URL contains special characters it must be URL encoded.  This site can be used to create URL-encoded strings for you: http://www.albionresearch.com/misc/urlencode.php.
If a value is found for the query string, we create a new HttpWebRequest.  For more information on the HttpWebRequest class, see the following MSDN page: http://msdn.microsoft.com/en-us/library/system.net.httpwebrequest.aspx.
Using the request's GetResponse() method, we get a response (the remote page content).  However, we first need to make sure that we received a valid response ("OK").  We might not receive one if the remote server returns a 404 Not Found, 500 Internal Server Error, 401 Unauthorized, or another HTTP error; note that HttpWebRequest raises a WebException for those status codes, which is another reason to add your own error handling.
If our response is OK, we use the Stream (data) of the response and output it to the page content.
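
As a rough sketch of the ID approach mentioned above, the following snippet resolves an ID from the query string against a hard-coded allow list; in a real application the dictionary would be replaced by a database lookup, and the GetAllowedUrl helper and the sample entries are only illustrative:

// Hypothetical allow list: only URLs registered here can be proxied.
// In practice this would be a database table keyed by ID.
private static readonly Dictionary<string, string> allowedUrls =
    new Dictionary<string, string> {
        { "bing", "http://www.bing.com/" },
        { "weather", "http://www.example.com/weather.xml" }
    };

private static string GetAllowedUrl(string id) {
    string url;
    // Returns null for unknown IDs so the caller can simply skip the request.
    return (id != null && allowedUrls.TryGetValue(id, out url)) ? url : null;
}

// In Page_Load, instead of decoding a raw URL from the query string:
// string proxyURL = GetAllowedUrl(Request.QueryString["id"]);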

Note: After the "if (response.StatusCode == HttpStatusCode.OK) { … }" check, you could add an else statement that writes the StatusCode to an error log so that you can troubleshoot if necessary.
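
A minimal sketch of that kind of handling is shown below; the LogError helper is hypothetical and stands in for whatever logging your application already uses:

try {
    HttpWebResponse response = (HttpWebResponse)request.GetResponse();

    if (response.StatusCode == HttpStatusCode.OK) {
        Response.ContentType = response.ContentType;
        Response.Write(new StreamReader(response.GetResponseStream()).ReadToEnd());
    }
    else {
        // Successful but non-200 responses (e.g. 204 No Content) end up here.
        LogError("Proxy received status " + response.StatusCode + " for " + proxyURL);
    }
}
catch (WebException ex) {
    // 404, 500, 401 and similar HTTP errors are raised by GetResponse() as a WebException.
    LogError("Proxy request to " + proxyURL + " failed: " + ex.Message);
    Response.StatusCode = 502; // report the failure to the calling script as Bad Gateway
}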

This tutorial shows a simplified method for retrieving remote content and displaying it on your site so that AJAX, jQuery, JavaScript, etc. can access remote content.  When using this code, it is important to consider security and error handling to ensure the best implementation of a proxy page.


Proxy.aspx
<%@ Page Language="C#" 
         AutoEventWireup="true" 
         CodeBehind="Proxy.aspx.cs" 
         Inherits="Proxy._Proxy" %>



The Proxy.aspx page is blank except for the Page directive. When passing the URL in the query string, it is important that it is URL encoded. This prevents the query string of the remote site's URL from interfering with the proxy page's own query string.
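
For example, if you generate the proxy links server-side, HttpUtility.UrlEncode produces the encoded value; the Bing search URL below is just an illustration:

// Build a link to the proxy page, URL-encoding the remote address so that its own
// query string (q=sharepoint) does not get mixed into Proxy.aspx's query string.
string target = "http://www.bing.com/search?q=sharepoint";
string proxyLink = "/proxy.aspx?u=" + HttpUtility.UrlEncode(target);
// proxyLink: /proxy.aspx?u=http%3a%2f%2fwww.bing.com%2fsearch%3fq%3dsharepoint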


Example Usage
http://www.yourdotnetapplication.com/proxy.aspx?u=http%3a%2f%2fwww.google.com
http://www.yoursharepointsite.com/_layouts/proxy.aspx?u=http%3a%2f%2fwww.google.com




The original article as well as source code can be found at:
http://www.johnchapman.name/aspnet-proxy-page-cross-domain-requests-from-ajax-and-javascript/
Author: chapmanjw


Comments

Expert Comment by arober11:
This may be of interest: if the remote site sends a few "Access-Control-Allow-*" headers (see https://developer.mozilla.org/en/HTTP_access_control), most browsers will have no difficulty requesting the content directly and won't need the proxy solution above.

If your REMOTE server is running Apache, you can achieve this by loading the "headers" module and adding something along the following lines to the remote httpd.conf:

# Allow remote Client side AJAX Calls to this site
Header set Access-Control-Allow-Origin "*"
Header set Access-Control-Allow-Methods "*"
Header set Access-Control-Allow-Headers "*"



If you don't have access to the remote server, and your LOCAL web server is Apache based, you can achieve the above functionality, plus a bit more, by using the "proxy" module to set up a ReverseProxy.  Just add something along the following lines to your local httpd.conf, and replace any URLs of the form "http://www.some.webservice/remote_script.php" in the JavaScript with "/proxy/remote_script.php":
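
# Illustrative mod_proxy mapping (requires mod_proxy and mod_proxy_http to be loaded;
# adjust the /proxy/ path and the placeholder remote host to match your service).
ProxyRequests Off
ProxyPass /proxy/ http://www.some.webservice/
ProxyPassReverse /proxy/ http://www.some.webservice/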


Note: In addition to GETs, this proxy solution will also handle any POST or OPTIONS submissions from the browser, which may be useful if you're trying to add something like an exchange rate, stock price, or delivery rate lookup form/function to your site.

If you're running IIS and need to support more than GET requests, then the following ASP.NET solution may be of some interest: http://www.codeproject.com/KB/web-security/HTTPReverseProxy.aspx

Expert Comment by Kevin Cross:
Yes, that is a function of HTML5 XMLHttpRequest, so it is emerging technology that will bring more power to HTML code but also keep us on our toes with regard to security.

I also mentioned this in my article Cross-Site Exploitations, which is something to keep in mind when opening up this functionality.

By the way, you have my Yes vote above.

Kevin

Expert Comment by Bassam_Basamad:
Take a look at the following link, which explains easyXDM:

http://www.codeproject.com/KB/scripting/easyXDM.aspx

Kind Regards,
