Running Python script on Android Studio using Chaquopy can't open camera

Question:



My Python script uses the camera to detect fingers and runs successfully in PyCharm, but when I try to run it on Android Studio using Chaquopy it fails with the error `camera is not defined`.
I am new to Chaquopy and can't find similar problems or answers from other people.

Java Code:

    public class MainActivity extends PythonConsoleActivity {
        @Override protected Class<? extends Task> getTaskClass() {
            return Task.class;
        }

        public static class Task extends PythonConsoleActivity.Task {
            public Task(Application app) {
                super(app);
            }

            @Override public void run() {
                py.getModule("enders_keyboard_vision").callAttr("test");
            }
        }
    }

Python Code:

    import cv2 as cv
    import numpy as np
    import imutils
    import time
    import math
    import enders_keyboard
    """import enders_keyboard_output"""

    bg = None

    def test():
        print("Yarraaabb")

    def run_avg(image, aWeight):
        global bg
        if bg is None:
            bg = image.copy().astype("float")
            return
        cv.accumulateWeighted(image, bg, aWeight)

    def segment(image, threshold = 10):
        kernel = np.ones((5, 5), np.uint8)
        global bg
        diff = cv.absdiff(bg.astype("uint8"), image)
        thresholded = cv.threshold(diff, threshold, 255, cv.THRESH_BINARY)[1]
        cv.GaussianBlur(thresholded, (11, 11), 0)
        thresholded = cv.dilate(thresholded, kernel, 10)
        thresh = cv.threshold(thresholded, 100, 255, cv.THRESH_BINARY)[1]
        thresh = cv.erode(thresh, kernel, 20)
        conts, hierarchy = cv.findContours(thresholded.copy(), cv.RETR_EXTERNAL, cv.CHAIN_APPROX_SIMPLE)
        if len(conts) == 0:
            return
        else:
            segmented = max(conts, key=cv.contourArea)
            return (thresh, segmented)

    def rough_hull(hull_ids, cont, max_dist):
        if len(hull_ids) > 0 and len(cont) > 0:
            res = []
            current_pos = cont[hull_ids[0]][0][0]
            points = []
            for point in hull_ids:
                dist = np.linalg.norm(cont[point][0][0] - current_pos)
                if dist > max_dist:
                    res.append(point)
                    current_pos = cont[point][0][0]
            return res
        else:
            return []

    def get_mouse_pos(event, x, y, flags, param):
        if event == cv.EVENT_LBUTTONDOWN:
            print(x, y)

    def vector_proj(v1, v2):
        return np.multiply((np.dot(v1, v2) / np.dot(v1, v1)), v1)

    def simulate_key(key):
        """enders_keyboard_output.type_key(key)"""
        print("ESHTAA")

    if __name__ == "__main__":
        aWeight = 0.5
        key_dict = {
            0: "a", 1: "b", 2: "c", 3: "d", 4: "e", 5: "f", 6: "g", 7: "h",
            8: "i", 9: "j", 10: "k", 11: "l", 12: "m", 13: "n", 14: "o", 15: "p",
            16: "q", 17: "r", 18: "s", 19: "t", 20: "u", 21: "v", 22: "w",
            23: "x", 24: "y", 25: "z"
        }
        camera = cv.VideoCapture(0)
        time.sleep(1)
        top, right, bottom, left = 10, 350, 350, 750
        num_fingers = 0
        num_frames = 0
        start_points = [
            (385, 235),  # thumb
            (425, 125),  # index
            (500, 105),  # middle
            (560, 130),  # ring
            (615, 210)   # pinky
        ]
        start_center = (0, 0)
        current_points = start_points.copy()
        act = False
        last_found = [True, True, True, True, True]
        while True:
            (grabbed, frame) = camera.read()
            frame = imutils.resize(frame, width = 700)
            frame = cv.flip(frame, 1)
            (height, width) = frame.shape[:2]
            roi = frame[top:bottom, right:left]
            gray = cv.cvtColor(roi, cv.COLOR_BGR2GRAY)
            gray = cv.GaussianBlur(gray, (5, 5), 0)
            inner = [False, False, False, False, False]
            outer = [False, False, False, False, False]
            if num_frames < 10:
                run_avg(gray, aWeight)
                cv.circle(frame, (int(height / 2), int(width / 2)), 30, (0, 0, 255))
            else:
                cv.imshow("background", bg / 255)
                hand = segment(gray)
                if hand is not None:
                    (thresholded, segmented) = hand
                    #if cv.countNonZero(thresholded) > ((top - bottom) * (left - right) * 0.95):
                    #    time.sleep(0.5)
                    #    bg = None
                    #    num_frames = 0
                    #cv.drawContours(frame, [segmented + (right, top)], -1, (0, 0, 255))
                    convex_hull = cv.convexHull(segmented + (right, top), returnPoints = False)
                    hull = rough_hull(convex_hull, segmented, 40)
                    # remove bottom two points
                    #del hull[hull[0][:, :, 1].argmin()[0]]
                    #del hull[hull[0][:, :, 1].argmin()[1]]
                    if len(segmented) > 0:
                        hull_sorted = sorted(hull, key = lambda a: segmented[a[0]][0][1])
                        hull_sorted = hull_sorted[:min(len(hull_sorted), 5)]
                        activated = []
                        if act is False:
                            for point in range(5):
                                activated.append(False)
                                for pt in hull_sorted:
                                    if math.hypot(start_points[point][0] - segmented[pt][0][0][0] - right, start_points[point][1] - segmented[pt][0][0][1]) < 25:
                                        activated[point] = True
                                cv.circle(frame, (start_points[point][0], start_points[point][1]), 30, (255, 0, 0), thickness = -1 if activated[point] else 1)
                            num_fingers = 0
                            for active in activated:
                                if active is True:
                                    num_fingers += 1
                            if num_fingers >= 5:
                                print("act")
                                act = True
                                m = cv.moments(segmented)
                                start_center = (int(m["m10"] / m["m00"]) + right, int(m["m01"] / m["m00"]) + top)
                        else:
                            m = cv.moments(segmented)
                            current_center = (int(m["m10"] / m["m00"]) + right, int(m["m01"] / m["m00"]) + top)
                            center_diff = np.subtract(current_center, start_center)
                            for point in range(len(start_points)):
                                start_points[point] = np.add(start_points[point], center_diff)
                            start_center = current_center
                            thumb_inner = False
                            thumb_outer = False
                            found = [False, False, False, False, False]
                            for point in range(5):
                                vect = [start_points[point][0] - current_center[0], start_points[point][1] - current_center[1]]
                                mag = math.sqrt(vect[0]**2 + vect[1]**2)
                                inner[point] = False
                                outer[point] = False
                                for pt in hull_sorted:
                                    if math.hypot(current_points[point][0] - segmented[pt][0][0][0] - right, current_points[point][1] - segmented[pt][0][0][1]) < 20 and math.hypot(start_points[point][0] - segmented[pt][0][0][0] - right, start_points[point][1] - segmented[pt][0][0][1]) < 40:
                                        current_points[point] = (segmented[pt][0][0][0] + right, segmented[pt][0][0][1])
                                        diff = np.subtract((current_points[point][0], current_points[point][1]), start_points[point])
                                        adjusted_pt = np.add(start_points[point], vector_proj(vect, diff))
                                        cv.circle(frame, (int(adjusted_pt[0]), int(adjusted_pt[1])), 30, (255, 0, 0), thickness = -1)
                                        found[point] = True
                                if (not found[point]) and found[point] is not last_found[point]:
                                    d = math.hypot(current_points[point][0] - current_center[0], current_points[point][1] - current_center[1])
                                    current_points[point] = start_points[point]
                                    if d < mag:
                                        inner[point] = True
                                        outer[point] = False
                                    else:
                                        inner[point] = False
                                        outer[point] = True
                                cv.circle(frame, (current_points[point][0], current_points[point][1]), 25, (255, 0, 255))
                                last_found[point] = found[point]
                                cv.circle(frame, (start_points[point][0], start_points[point][1]), 30, (0, 255, 0), thickness = 1)
                                cv.line(frame, (start_points[point][0], start_points[point][1]), (int(start_points[point][0] + vect[0] * 15 / mag), int(start_points[point][1] + vect[1] * 15 / mag)), (0, 255, 0), thickness = 1)
                            cv.circle(frame, current_center, 25, (0, 0, 255))
                            thumb_dist = math.hypot(current_points[0][0] - current_center[0], current_points[0][1] - current_center[1])
                            thumb_vect = [start_points[0][0] - start_center[0], start_points[0][1] - start_center[1]]
                            thumb_mag = math.sqrt(vect[0]**2 + vect[1]**2)
                            if thumb_dist - thumb_mag < -15:
                                thumb_inner = True
                                thumb_outer = False
                            elif thumb_dist - thumb_mag > 15:
                                thumb_outer = True
                                thumb_inner = False
                            for finger in range(len(inner)):
                                if inner[finger] is True:
                                    simulate_key(key_dict[(8 if thumb_inner else (16 if thumb_outer else 0)) + finger * 2])
                                elif outer[finger] is True:
                                    simulate_key(key_dict[(8 if thumb_inner else (16 if thumb_outer else 0)) + finger * 2 + 1])
                    #cv.drawContours(frame, [cv.convexHull(segmented + (right, top), segmented, 5)], -1, (0, 0, 255))
                    cv.imshow("Thresholded", thresholded)
            cv.rectangle(frame, (left, top), (right, bottom), (0, 255, 0), 2)
            num_frames += 1
            cv.setMouseCallback("Video Feed", get_mouse_pos)
            cv.imshow("Video Feed", frame)
            keypress = cv.waitKey(10) & 0xFF
            if keypress == ord("q"):
                break
            if keypress == ord("r"):
                num_frames = 0
                start_points = [
                    (385, 235),  # thumb
                    (425, 125),  # index
                    (500, 105),  # middle
                    (560, 130),  # ring
                    (615, 210)   # pinky
                ]
                act = False
                bg = None
                time.sleep(0.1)
            if keypress == ord("s"):
                enders_keyboard.search(num_fingers)
        camera.release()
        cv.destroyAllWindows()

Error:

com.chaquo.python.PyException: NameError: name 'camera' is not defined
at <python>.enders_keyboard_vision.<module>(enders_keyboard_vision.py:262)
at <python>.importlib._bootstrap._call_with_frames_removed(<frozen importlib._bootstrap>:219)
at <python>.importlib._bootstrap_external.exec_module(<frozen importlib._bootstrap_external>:783)
at <python>.importlib._bootstrap._load_unlocked(<frozen importlib._bootstrap>:671)
at <python>.importlib._bootstrap._find_and_load_unlocked(<frozen importlib._bootstrap>:975)
at <python>.importlib._bootstrap._find_and_load(<frozen importlib._bootstrap>:991)
at <python>.importlib._bootstrap._gcd_import(<frozen importlib._bootstrap>:1014)
at <python>.importlib.import_module(__init__.py:127)
at <python>.chaquopy_java.Java_com_chaquo_python_Python_getModule(chaquopy_java.pyx:153)
at com.chaquo.python.Python.getModule(Native Method)
at com.chaquo.python.console.MainActivity$Task.run(MainActivity.java:21)
at com.chaquo.python.utils.ConsoleActivity$Task$1.run(ConsoleActivity.java:359)

Answer 1 (score: 1)

Your module is not being loaded as __main__, so that entire block of code will not be run. Instead of putting it in a __main__ block, put it in a function and call it using callAttr.
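A minimal sketch of that restructuring (the module name matches the question; the function body here is only a placeholder for the original `__main__` block):

```python
# enders_keyboard_vision.py -- sketch: nothing runs at import time.
# The body of the former `if __name__ == "__main__":` block moves into
# a function that the Java side can trigger explicitly.

def main():
    # ... the camera/segmentation loop from the question goes here ...
    return "started"

if __name__ == "__main__":
    main()  # still runnable directly on a desktop
```

On the Java side, `py.getModule("enders_keyboard_vision").callAttr("main")` would then start the processing instead of relying on implicit `__main__` execution.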

Separately, the Chaquopy build of OpenCV doesn't currently support accessing the camera directly. The easiest workaround is to capture the image using Java, save it to a temporary file, and load that file in Python.
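The temp-file handoff can be sketched with the standard library standing in for OpenCV; in the real app the Java side would write a captured JPEG and Python would call `cv.imread(path)` instead of `open`, and `process_frame` is a hypothetical name:

```python
import os
import tempfile

def process_frame(path):
    # Stand-in for cv.imread(path) plus the finger-detection pipeline:
    # read the bytes the "Java side" wrote and report their size.
    with open(path, "rb") as f:
        data = f.read()
    return len(data)

# Simulate the Java side capturing a frame and saving it to a temp file.
fd, path = tempfile.mkstemp(suffix=".jpg")
with os.fdopen(fd, "wb") as f:
    f.write(b"\xff\xd8fake-jpeg-bytes")

print(process_frame(path))  # prints 17: the size of the fake frame
os.remove(path)
```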

Answer 2 (score: 0)

Add an explicit camera permission to your Android project's configuration.
All external resources (camera, mic, internet requests, GPS requests) accessed via the Chaquopy channel need explicit permissions; otherwise you'll get not defined / not found errors.
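For the camera specifically, that means declaring the permission in AndroidManifest.xml (on API 23+ it must also be requested at runtime):

```xml
<!-- AndroidManifest.xml: declare camera access for the app -->
<uses-permission android:name="android.permission.CAMERA" />
<uses-feature android:name="android.hardware.camera" android:required="true" />
```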

huangapple
  • Posted on 2020-04-06 02:11:57
  • When reposting, please keep the original link: https://go.coder-hub.com/61047244.html