I have an ASP.NET application sending data through AJAX to a handler within my application. This works as it should when debugging locally, but as soon as I deploy the solution to the server, the handler only receives an empty string. I tried fiddling around with contentType and dataType, but without luck.
Here is my code so far.
aspx of the sending page, where "myData" is a simple string:
$.ajax({
    type: "POST",
    url: "handlers/changeRiskGroup.ashx",
    data: myData,
    // tried all those content/dataTypes without any luck
    //contentType: "text/plain",
    //dataType: "text",
    //contentType: "application/json; charset=utf-8",
    //dataType: "json",
    error: function (xhr, status, error) {
        console.log(xhr.responseText);
    },
    success: function (msg) {
        console.log(msg);
    }
});
.ashx.cs of the receiving handler:
using System;
using System.IO;
using System.Web;

public class changeRiskGroup : IHttpHandler {

    public void ProcessRequest(HttpContext context) {
        //string data = new StreamReader(context.Request.InputStream).ReadToEnd();
        var data = String.Empty;
        context.Request.InputStream.Position = 0;
        using (var inputStream = new StreamReader(context.Request.InputStream)) {
            data = inputStream.ReadToEnd();
        }

        if (data != "") {
            // doing something with my data here.
            // this is never reached while on the server, but works fine locally!
        } else {
            context.Response.Write("Please supply data to the risk group service!");
        }
    }

    public bool IsReusable {
        get {
            return false;
        }
    }
}
The data variable in the .ashx.cs file is filled when debugging locally, but is always "" on the server. I have no clue why.
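In case it helps to narrow this down: I could temporarily swap the handler body for a few diagnostic lines like the ones below and compare the output of a local request with one against the deployed site (just a sketch, the output format is arbitrary):

public void ProcessRequest(HttpContext context) {
    // Dump the basic facts about the incoming request so the local and
    // deployed behaviour can be compared directly.
    context.Response.ContentType = "text/plain";
    context.Response.Write("Method: " + context.Request.HttpMethod + "\n");
    context.Response.Write("ContentType: " + context.Request.ContentType + "\n");
    context.Response.Write("ContentLength: " + context.Request.ContentLength + "\n");
    context.Response.Write("RawUrl: " + context.Request.RawUrl + "\n");
    context.Response.Write("Form fields: " + context.Request.Form.Count + "\n");
}

If ContentLength already shows 0 on the server, the body is lost before it ever reaches the handler; if it is non-zero, the problem would be in how I read the stream.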