I tried to implement transparency but something went wrong

Question
I wanted to add "transparency" to my game engine, something I had no prior knowledge of. I couldn't find any straight answer on how to implement it, so I ended up doing some research about something called alpha blending, which, if I've understood it right, is one way of rendering opacity.
I Googled around and tried to find a source that showed how to implement this when you have a pixel array, but I found nothing except a tutorial on YouTube. They didn't explain why they did things the way they did, so I still have no clue how it works or how I can implement it. I tried to follow the tutorial, but the code they used didn't work at all, so I changed it a bit (which clearly didn't work either).
The code below is my setPixel() function, which sets a pixel at a specified location. At the start, the function just checks whether it needs to place a pixel at all. This function is used to draw every individual pixel from the pixel data. The pixel data of the screen is stored in the variable pixels, while the image data is stored in value. Note that value is just a single integer, whereas pixels is an array.
public void setPixel(int x, int y, int value, Color invis) {
    int alpha = (value >> 24);
    if (invis != null && value == invis.getRGB() || alpha == 0x00) {
        return;
    }
    if (!isOutSideScreen(x, y)) {
        if (alpha == 255) {
            pixels[x + y * pWidth] = value;
        } else {
            int pixelColor = value;
            int newRed = ((pixelColor >> 16) & 0xff) + (int)((((pixelColor >> 16) & 0xff) - ((pixels[x + y * pWidth] >> 16) & 0xff)) * (alpha / 255f));
            int newGreen = ((pixelColor >> 8) & 0xff) + (int)((((pixelColor >> 8) & 0xff) - ((pixels[x + y * pWidth] >> 8) & 0xff)) * (alpha / 255f));
            int newBlue = (pixelColor & 0xff) + (int)(((pixelColor & 0xff) - (pixels[x + y * pWidth] & 0xff)) * (alpha / 255f));
            pixels[x + y * pWidth] = ((255 << 24) | (newRed << 16) | (newGreen << 8) | newBlue);
        }
    }
}
What I don't understand about this code is all the bitwise operations, and why you calculate the colors like that.
int newRed = ((pixelColor >> 16) & 0xff) + (int)((((pixelColor >> 16) & 0xff) - ((pixels[x + y * pWidth] >> 16) & 0xff)) * (alpha / 255f));
int newGreen = ((pixelColor >> 8) & 0xff) + (int)((((pixelColor >> 8) & 0xff) - ((pixels[x + y * pWidth] >> 8) & 0xff)) * (alpha / 255f));
int newBlue = (pixelColor & 0xff) + (int)(((pixelColor & 0xff) - (pixels[x + y * pWidth] & 0xff)) * (alpha / 255f));
pixels[x + y * pWidth] = ((255 << 24) | (newRed << 16) | (newGreen << 8) | newBlue);
If someone can explain why this code doesn't work and how it actually works, I would be forever grateful!
Thanks in advance, and sorry for my ignorance!
Edit after Joni's answer
This is the code I now use:
int pixelColor = pixels[x + y * pWidth];
int newRed = (int)((1 - (alpha / 255f)) * ((pixelColor >> 16) & 0xff) + (alpha / 255f) * ((value >> 16) & 0xff));
int newGreen = (int)((1 - (alpha / 255f)) * ((pixelColor >> 8) & 0xff) + (alpha / 255f) * ((value >> 8) & 0xff));
int newBlue = (int)((1 - (alpha / 255f)) * (pixelColor & 0xff) + (alpha / 255f) * (value & 0xff));
pixels[x + y * pWidth] = ((255 << 24) | (newRed << 16) | (newGreen << 8) | newBlue);
I used the formula: outColor = (1 - alpha) * backgroundColor + alpha * newColor
Answer 1
Score: 1
The formula for alpha blending "newColor" on top of "backgroundColor" is:
outColor = (1 - alpha) * backgroundColor + alpha * newColor
To see why it works, try different values of alpha. With alpha=0, you get the background color. With alpha=1, you get newColor.
The formula you've programmed is
outColor = backgroundColor + alpha * (backgroundColor - newColor)
Alpha = 1 gives you outColor = 2*backgroundColor - newColor, which is incorrect. To fix it you need to swap pixelColor and pixels[x + y * pWidth] around. For example, for the blue channel:
int newBlue = (pixelColor & 0xff) + (int)(((pixels[x + y * pWidth] & 0xff) - (pixelColor & 0xff)) * (alpha / 255f));
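To make the arithmetic concrete, here is a minimal sketch of the blend formula applied to a single 8-bit channel (the class and method names are made up for illustration, not part of the original code):

```java
// Sketch of alpha blending one 8-bit channel:
// out = (1 - a) * background + a * src, with a = alpha / 255.
public class BlendSketch {
    static int blendChannel(int background, int src, int alpha) {
        float a = alpha / 255f;
        return (int) ((1 - a) * background + a * src);
    }

    public static void main(String[] args) {
        System.out.println(blendChannel(200, 100, 0));   // fully transparent: 200
        System.out.println(blendChannel(200, 100, 255)); // fully opaque: 100
        System.out.println(blendChannel(200, 100, 128)); // roughly halfway
    }
}
```

With alpha = 0 the background survives unchanged, and with alpha = 255 the new color replaces it entirely, which matches the two sanity checks above.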
> What I don't understand about this code is all the bitwise code and why you calculate the colors like that.

This code assumes a color model that packs four 8-bit integers into one int. The most significant 8 bits make up the alpha component, followed by 8 bits each for the red, green, and blue components. The way you extract a component out of an int is with bitwise operators. For example, color & 0xff is the lowest 8 bits, therefore it's the blue component. (color >> 8) & 0xff gives you the second lowest 8 bits, which is the green component.
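For instance, pulling the four components out of a packed ARGB int, and packing them back together, can be sketched like this (the color value is arbitrary):

```java
public class ArgbSketch {
    public static void main(String[] args) {
        int color = 0x80FF6420; // alpha=0x80, red=0xFF, green=0x64, blue=0x20

        int alpha = (color >>> 24) & 0xff; // >>> avoids sign extension on the top byte
        int red   = (color >> 16) & 0xff;
        int green = (color >> 8) & 0xff;
        int blue  = color & 0xff;
        System.out.printf("a=%d r=%d g=%d b=%d%n", alpha, red, green, blue); // a=128 r=255 g=100 b=32

        // Packing reverses the process: shift each byte back into its slot.
        int repacked = (alpha << 24) | (red << 16) | (green << 8) | blue;
        System.out.println(repacked == color); // true
    }
}
```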
Answer 2
Score: 0
> The pixel data of the screen is stored in the variable pixels.
Don't know if this helps or not, but you can create a BufferedImage from an array of pixels using something like the following:
import java.awt.*;
import java.awt.image.*;
import javax.swing.*;
public class ImageFromArray3 extends JFrame
{
int width = 50;
int height = 50;
int imageSize = width * height;
public ImageFromArray3()
{
JPanel panel = new JPanel();
getContentPane().add( panel );
int[] pixels = new int[imageSize];
// Create Red Image
for (int i = 0; i < imageSize; i++)
{
//pixels[i] = 255 << 16; // no alpha
pixels[i] = (64 << 24 ) + (255 << 16);
}
panel.add( createImageLabel(pixels) );
// Create Green Image
for (int i = 0; i < imageSize; i++)
{
//pixels[i] = 255 << 8;
pixels[i] = (128 << 24 ) + (255 << 8);
}
panel.add( createImageLabel(pixels) );
// Create Blue Image
for (int i = 0; i < imageSize; i++)
{
//pixels[i] = 255;
pixels[i] = (192 << 24 ) + (255);
}
panel.add( createImageLabel(pixels) );
// Create Cyan Image
for (int i = 0; i < imageSize; i++)
{
//pixels[i] = (255 << 8) + 255;
pixels[i] = (255 << 24 ) + (255 << 8) + (255);
}
panel.add( createImageLabel(pixels) );
}
private JLabel createImageLabel(int[] pixels)
{
//BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_INT_RRGB);
BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_INT_ARGB);
WritableRaster raster = image.getRaster();
raster.setDataElements(0, 0, width, height, pixels);
JLabel label = new JLabel( new ImageIcon(image) );
return label;
}
public static void main(String args[])
{
JFrame frame = new ImageFromArray3();
frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
frame.pack();
frame.setLocationRelativeTo( null );
frame.setVisible( true );
}
}
It uses different alpha values for each color.
Now you have your BufferedImage so you should be able to use the AlphaComposite.
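As a minimal sketch of that idea (colors and sizes picked arbitrarily), drawing with an AlphaComposite into an offscreen BufferedImage looks roughly like this:

```java
import java.awt.AlphaComposite;
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public class CompositeSketch {
    public static void main(String[] args) {
        BufferedImage image = new BufferedImage(10, 10, BufferedImage.TYPE_INT_ARGB);
        Graphics2D g = image.createGraphics();

        // Opaque red background.
        g.setColor(Color.RED);
        g.fillRect(0, 0, 10, 10);

        // Draw blue on top at 50% opacity using SRC_OVER compositing.
        g.setComposite(AlphaComposite.getInstance(AlphaComposite.SRC_OVER, 0.5f));
        g.setColor(Color.BLUE);
        g.fillRect(0, 0, 10, 10);
        g.dispose();

        int rgb = image.getRGB(5, 5);
        // Red and blue each end up roughly halfway between 0 and 255.
        System.out.printf("r=%d b=%d%n", (rgb >> 16) & 0xff, rgb & 0xff);
    }
}
```

This does the same per-channel blend as the hand-rolled formula, but lets Java 2D handle the arithmetic.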