Swift: glDrawElements crashing with EXC_BAD_ACCESS code=1
I'm working through this guide to learn OpenGL on iOS, and I want to implement everything in Swift. The following code crashes for me:
The data structures:
```swift
private struct Vertex {
    var Position: (GLfloat, GLfloat, GLfloat)
    var Color: (GLfloat, GLfloat, GLfloat, GLfloat)
}

private static var Vertices = [
    Vertex(Position: (1, -1, 0),  Color: (1, 0, 0, 1)),
    Vertex(Position: (1, 1, 0),   Color: (0, 1, 0, 1)),
    Vertex(Position: (-1, 1, 0),  Color: (0, 0, 1, 1)),
    Vertex(Position: (-1, -1, 0), Color: (0, 0, 0, 1))
]

private static var Indices: [GLubyte] = [
    0, 1, 2,
    2, 3, 0
]
```
Creating the vertex buffers:
```swift
var vertexBuffer = GLuint()
glGenBuffers(1, &vertexBuffer)
glBindBuffer(GLenum(GL_ARRAY_BUFFER), vertexBuffer)
glBufferData(GLenum(GL_ARRAY_BUFFER), Vertices.size, Vertices, GLenum(GL_STATIC_DRAW))

var indexBuffer = GLuint()
glGenBuffers(1, &indexBuffer)
glBindBuffer(GLenum(GL_ELEMENT_ARRAY_BUFFER), indexBuffer)
glBufferData(GLenum(GL_ELEMENT_ARRAY_BUFFER), Indices.size, Indices, GLenum(GL_STATIC_DRAW))
```
Setting the attribute offsets:
```swift
var positionPtr = 0
glVertexAttribPointer(GLuint(positionSlot), 3, GLenum(GL_FLOAT), GLboolean(GL_FALSE), GLsizei(strideofValue(Vertex)), &positionPtr)
var colorPtr = strideof(GLfloat) * 3
glVertexAttribPointer(GLuint(colorSlot), 4, GLenum(GL_FLOAT), GLboolean(GL_FALSE), GLsizei(strideofValue(Vertex)), &colorPtr)
```
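For reference, the byte offsets these two calls are meant to describe (assuming the tuple layout of `Vertex` above, with no padding between the fields) can be checked directly. A small sketch in current Swift spelling (`MemoryLayout` replaced `strideof`/`sizeof` in Swift 3):

```swift
// Byte layout of the Vertex struct above: Position starts at offset 0,
// Color starts right after three GLfloats.
let positionOffset = 0
let colorOffset = MemoryLayout<Float>.stride * 3  // GLfloat is Float on iOS

print(positionOffset)  // 0
print(colorOffset)     // 12
```

These integer values — not addresses of local variables — are what the last parameter of `glVertexAttribPointer` is supposed to carry while a `GL_ARRAY_BUFFER` is bound.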
The crash (when trying to draw):
```swift
var startPtr = 0
// EXC_BAD_ACCESS code=1 here!
glDrawElements(GLenum(GL_TRIANGLES), GLsizei(Indices.count / 3), GLenum(GL_UNSIGNED_BYTE), &startPtr)
```
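My understanding of the cause, based on how ES 2.0 treats this parameter: when a buffer is bound to `GL_ELEMENT_ARRAY_BUFFER`, the last argument of `glDrawElements` is interpreted as a byte *offset* into that buffer, not as a pointer to index data in client memory. `&startPtr` therefore hands GL the stack address of `startPtr`, which is read as an enormous offset — hence the EXC_BAD_ACCESS. The same applies to `&positionPtr`/`&colorPtr` above. A minimal GL-free sketch of the distinction (current Swift spelling):

```swift
// `&startPtr` passes the ADDRESS of the local variable, not its value.
var startPtr = 0
withUnsafePointer(to: &startPtr) { p in
    // This is what GL receives: a large stack address, read as a byte offset.
    print(UInt(bitPattern: p) != 0)  // true
}

// "Byte offset 0 into the bound buffer" is spelled as a nil pointer instead:
let offsetZero = UnsafeRawPointer(bitPattern: 0)
print(offsetZero == nil)  // true
```

With the index buffer bound, the draw call would be `glDrawElements(GLenum(GL_TRIANGLES), GLsizei(Indices.count), GLenum(GL_UNSIGNED_BYTE), nil)` — note also that the count argument is the number of indices (6 here), not `Indices.count / 3`.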
All shaders compile without any errors. Here is how I compute the size of an array:
```swift
extension Array {
    var size: Int {
        get {
            return self.count * strideof(Element)
        }
    }
}
```
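As a sanity check on this helper: `strideof(Element)` includes any trailing padding per element, which is what `glBufferData` needs for an array of structs. A current-Swift equivalent (`MemoryLayout<T>.stride` replaced `strideof` in Swift 3; the `byteSize` name is mine):

```swift
struct Vertex {
    var Position: (Float, Float, Float)        // GLfloat == Float on iOS
    var Color: (Float, Float, Float, Float)
}

extension Array {
    // Total byte count of the array's contents, as passed to glBufferData.
    var byteSize: Int { return count * MemoryLayout<Element>.stride }
}

let vertices = [
    Vertex(Position: (1, -1, 0), Color: (1, 0, 0, 1)),
    Vertex(Position: (-1, 1, 0), Color: (0, 0, 1, 1))
]
print(MemoryLayout<Vertex>.stride)  // 28: seven 4-byte floats, 4-byte alignment
print(vertices.byteSize)            // 56
```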
UPD: I'm using OpenGL ES 2.0.
I worked through your guide about 4 months ago and converted it from Objective-C to Swift, up to the point of drawing the same rectangle as in the picture. I've now run it again, converted to Swift 2.1; it still works and displays the same image below.
Here is my code (the setupVBOs and render methods, and the structs):
```swift
// Track of all our per-vertex information (currently just color and position)
struct Vertex {
    var Position: (CFloat, CFloat, CFloat)
    var Color: (CFloat, CFloat, CFloat, CFloat)
}

// Array with all the info for each vertex
var Vertices = [
    Vertex(Position: (1, -1, 0),  Color: (1, 0, 0, 1)),
    Vertex(Position: (1, 1, 0),   Color: (0, 1, 0, 1)),
    Vertex(Position: (-1, 1, 0),  Color: (0, 0, 1, 1)),
    Vertex(Position: (-1, -1, 0), Color: (0, 0, 0, 1))
]

// Array that gives a list of triangles to create, by specifying the 3 vertices that make up each triangle
var Indices: [GLubyte] = [
    0, 1, 2,
    2, 3, 0
]

// Helper extensions to pass arguments to GL land
extension Array {
    func size() -> Int {
        return self.count * sizeofValue(self[0])
    }
}

// The best way to send data to OpenGL is through something called Vertex Buffer Objects.
func setupVBOs() {
    // VBO: Vertex Buffer Objects.
    // There are two types of vertex buffer objects – one to keep track of the per-vertex data
    // (like we have in the Vertices array), and one to keep track of the indices that make up
    // triangles (like we have in the Indices array).
    glGenBuffers(1, &vertexBuffer)
    glBindBuffer(GLenum(GL_ARRAY_BUFFER), vertexBuffer)
    glBufferData(GLenum(GL_ARRAY_BUFFER), Vertices.count * sizeofValue(Vertices[0]), Vertices, GLenum(GL_STATIC_DRAW)) // send the data over to OpenGL-land.

    glGenBuffers(1, &indexBuffer)
    glBindBuffer(GLenum(GL_ELEMENT_ARRAY_BUFFER), indexBuffer)
    glBufferData(GLenum(GL_ELEMENT_ARRAY_BUFFER), Indices.count * sizeofValue(Indices[0]), Indices, GLenum(GL_STATIC_DRAW))
}

func render() {
    glClearColor(0, 104.0/255.0, 55.0/255.0, 1.0)
    glClear(GLbitfield(GL_COLOR_BUFFER_BIT))

    //glViewport(0, 0, GLint(frame.size.width), GLint(frame.size.height))
    glViewport(0, GLint(frame.size.height/2)/2, GLint(frame.size.width), GLint(frame.size.height/2))

    // Feed the correct values to the two input variables for the vertex shader – the Position and SourceColor attributes.
    glVertexAttribPointer(positionSlot, 3, GLenum(GL_FLOAT), GLboolean(UInt8(GL_FALSE)), GLsizei(sizeof(Vertex)), nil)
    glVertexAttribPointer(colorSlot, 4, GLenum(GL_FLOAT), GLboolean(UInt8(GL_FALSE)), GLsizei(sizeof(Vertex)), UnsafePointer<Int>(bitPattern: sizeof(Float) * 3))

    // This actually ends up calling your vertex shader for every vertex you pass in,
    // and then the fragment shader on each pixel to display on the screen.
    glDrawElements(GLenum(GL_TRIANGLES), GLsizei(Indices.count), GLenum(GL_UNSIGNED_BYTE), nil)

    _context.presentRenderbuffer(Int(GL_RENDERBUFFER))
}
```