Error Handling Azure Content Moderation API
Question
I'm trying to create an AuditLog for my moderated content and would like to add the exception message sent by the Azure Content Moderation API to the log. Can anyone assist me with saving this to a variable?
To give you context on how it will work:
```c#
if (isProfanity)
{
    throw new RemoteException("The username contains profanity. Please pick a different one.",
        "The username contains profanity and has been denied.",
        RemoteExceptionType.ProfileUsernameProfanityException);
}
else if (warning)
{
    var log = new ProfanityAuditLog
    {
        Username = text,
        DetectedLanguage = detectedLanguage.DetectedLanguageProperty,
        IsProfanity = isProfanity,
        ThresholdCategory1 = _options.ThresholdCategory1,
        ThresholdCategory2 = _options.ThresholdCategory2,
        ThresholdCategory3 = _options.ThresholdCategory3,
        //ExceptionMessage = exception
    };
    await _profanityAuditRepo.AddLogAsync(log); // await so the log is persisted before throwing
    throw new RemoteException("The username might contain profanity.",
        "The username might contain profanity and will have to be validated by support.",
        RemoteExceptionType.ProfileUsernameProfanityWarningException);
}
```
If the word is profanity, the remote exception should be thrown and the word should be considered profanity; if the word's score falls between `0.5` and `0.7`, the word should be categorized as a warning, added to the `AuditLog` for user validation, and a remote exception should be thrown to notify the user. The issue is that the exception message I am referring to in the `log` needs to be the exception I receive from `Azure`, in case the problem is related to `Azure` rather than the word, e.g. the API service is down.
This is what I got from some forums and Googling but it's not exactly what I am looking for.
```c#
string exceptionMessage = screen?.Classification.ReviewRecommended == true ? "Azure review recommended" : "Custom warning message";
```
# Answer 1
**Score**: 0
- You are using a conditional expression to generate the `exceptionMessage`. This won't directly capture an exception thrown by the Azure Content Moderation API.
- Catch the exception thrown by the API, extract the error message, and then include it in your log.

Here, I create a sample data model in a console app to capture exception messages for moderated content using the Azure Content Moderation API.

- **ProfanityAuditLog**: this will hold the information you want to log.
```c#
using System;
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.CognitiveServices.ContentModerator;
using Microsoft.Azure.CognitiveServices.ContentModerator.Models;

namespace ContentModerationApp
{
    internal class Program
    {
        private static async Task Main(string[] args)
        {
            Console.WriteLine("Enter a username or text to check for profanity:");
            string userInput = Console.ReadLine();
            try
            {
                await CheckContentAsync(userInput);
            }
            catch (Exception ex)
            {
                Console.WriteLine($"An error occurred: {ex.Message}");
                SaveToAuditLog(userInput, ex.Message);
            }
        }

        public static async Task CheckContentAsync(string text)
        {
            // Replace with your Azure Content Moderator API key and endpoint
            string apiKey = "************";
            string endpoint = "https://******.cognitiveservices.azure.com/";
            var client = new ContentModeratorClient(new ApiKeyServiceClientCredentials(apiKey)) { Endpoint = endpoint };
            try
            {
                var screen = await client.TextModeration.ScreenTextAsync("text/plain", new MemoryStream(Encoding.UTF8.GetBytes(text)));
                string exceptionMessage = "No exception occurred"; // Initialize a default value
                if (screen != null && screen.Classification != null && screen.Classification.ReviewRecommended == true)
                {
                    if (screen.Classification.Category1?.Score > 0.7)
                    {
                        exceptionMessage = "The username might contain profanity and will have to be validated by support.";
                        throw new Exception(exceptionMessage);
                    }
                    else if (screen.Classification.Category1?.Score > 0.5)
                    {
                        // Handle the case where the score falls between 0.5 and 0.7 (custom logic)
                    }
                }
                // ... (other code)
            }
            catch (Exception ex)
            {
                Console.WriteLine($"An error occurred during content moderation: {ex.Message}");
                throw;
            }
        }

        public static void SaveToAuditLog(string userInput, string exceptionMessage)
        {
            string logFilePath = "C:\\Users\\v-chikkams\\Downloads\\test.txt";
            using (StreamWriter writer = File.AppendText(logFilePath))
            {
                writer.WriteLine($"User Input: {userInput}");
                writer.WriteLine($"Exception Message: {exceptionMessage}");
                writer.WriteLine($"Timestamp: {DateTime.Now}");
                writer.WriteLine(new string('-', 40));
            }
            Console.WriteLine("Saved to audit log.");
        }
    }
}
```
- The above application reproduces the content moderation flow and logs based on the given username or text. When an exception (such as a `RemoteException`) is caught, it saves the relevant information to the audit log file. You can then deploy it to an App Service.
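Applied back to the question's own `ProfanityAuditLog`, the same pattern might look like the sketch below. This is only an illustration under assumptions: it presumes `ProfanityAuditLog` gains a string `ExceptionMessage` property (the commented-out field in the question) and that `client`, `text`, `isProfanity`, and `_profanityAuditRepo` exist as in the snippets above.

```c#
// Sketch (assumptions noted above): capture the message from an Azure API
// failure and store it on the audit log entry, instead of deriving it from
// the moderation verdict with a conditional expression.
string exceptionMessage = null;
Screen screen = null;
try
{
    screen = await client.TextModeration.ScreenTextAsync(
        "text/plain", new MemoryStream(Encoding.UTF8.GetBytes(text)));
}
catch (Exception ex)
{
    // This message comes from Azure itself (e.g. service down, bad
    // credentials), as opposed to a profanity verdict about the word.
    exceptionMessage = ex.Message;
}

var log = new ProfanityAuditLog
{
    Username = text,
    IsProfanity = isProfanity,
    ExceptionMessage = exceptionMessage // null when the API call succeeded
};
await _profanityAuditRepo.AddLogAsync(log);
```

This keeps "Azure failed" and "the word was flagged" as separate signals in the log, which is what the question asks for.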