How to use a Python list as a global variable within @task.external_python?



GOAL:

  • Have a Python list as a global variable shared between tasks.
  • Currently it crashes at the first task.
  • 1.) I am trying to have a simple Python list that is carried from one task to the next, with a few values appended to it in task 2. So the goal is to have one shared list.
  • 2.) Even if one task fails, the run should just move on and not care (obviously marking that task as failed).
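A note on goal 2.): in Airflow this is normally expressed with the standard per-task `trigger_rule` argument rather than with shared state. A sketch (requires an Airflow install; the venv path is the one used in the question below):

```python
from airflow.decorators import task

# trigger_rule="all_done" makes this task fire once all upstream tasks
# have finished, regardless of whether they succeeded or failed.
@task.external_python(
    task_id="two",
    python="/opt/airflow/venv1/bin/python3",
    trigger_rule="all_done",
)
def second(my_list):
    my_list.append(101)
    return my_list
```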

SETUP:

  • I am on Airflow 2.4.1
  • I use the Airflow Docker image and build a Python environment that I have used many times and that works fine.

MY CODE:

from __future__ import annotations
import logging
import os
import shutil
import sys
import tempfile
import time
from pprint import pprint

import pendulum
from airflow import DAG
from airflow.decorators import task

log = logging.getLogger(__name__)

PYTHON = sys.executable
BASE_DIR = tempfile.gettempdir()

my_default_args = {
    'owner': 'me',
    'email': ['some_email@some_email.com'],
    'email_on_failure': True,
    'email_on_retry': False,
    'write_successes': [],
}

with DAG(
    dag_id='my_dag_id',
    schedule='9 9 * * *',
    start_date=pendulum.datetime(2022, 1, 1, tz="UTC"),
    catchup=False,
    default_args=my_default_args,
    tags=['a', 'b'],
) as dag:

    @task.external_python(task_id="one", python='/opt/airflow/venv1/bin/python3')
    def first(**kwargs):
        task_id = "one"
        write_successes = kwargs.get('write_successes', [])
        print(write_successes)
        write_successes.append(99)
        print(write_successes)

    @task.external_python(task_id="two", python='/opt/airflow/venv1/bin/python3')
    def second(**kwargs):
        write_successes = kwargs.get('write_successes', [])
        print(write_successes)
        write_successes.append(101)
        print(write_successes)

    one = first()
    two = second()

    one >> two
ERROR:

*** Reading local file: /opt/airflow/logs/dag_id=test_global_variable/run_id=scheduled__2023-02-05T09:09:00+00:00/task_id=one/attempt=1.log
[2023-02-06, 12:24:43 GMT] {taskinstance.py:1165} INFO - Dependencies all met for <TaskInstance: test_global_variable.one scheduled__2023-02-05T09:09:00+00:00 [queued]>
[2023-02-06, 12:24:43 GMT] {taskinstance.py:1165} INFO - Dependencies all met for <TaskInstance: test_global_variable.one scheduled__2023-02-05T09:09:00+00:00 [queued]>
[2023-02-06, 12:24:43 GMT] {taskinstance.py:1362} INFO -
--------------------------------------------------------------------------------
[2023-02-06, 12:24:43 GMT] {taskinstance.py:1363} INFO - Starting attempt 1 of 1
[2023-02-06, 12:24:43 GMT] {taskinstance.py:1364} INFO -
--------------------------------------------------------------------------------
[2023-02-06, 12:24:43 GMT] {taskinstance.py:1383} INFO - Executing <Task(_PythonExternalDecoratedOperator): one> on 2023-02-05 09:09:00+00:00
[2023-02-06, 12:24:43 GMT] {standard_task_runner.py:54} INFO - Started process 239657 to run task
[2023-02-06, 12:24:43 GMT] {standard_task_runner.py:82} INFO - Running: ['airflow', 'tasks', 'run', 'test_global_variable', 'one', 'scheduled__2023-02-05T09:09:00+00:00', '--job-id', '72751', '--raw', '--subdir', 'DAGS_FOLDER/test_global_variable.py', '--cfg-path', '/tmp/tmpxldmrzpp']
[2023-02-06, 12:24:43 GMT] {standard_task_runner.py:83} INFO - Job 72751: Subtask one
[2023-02-06, 12:24:43 GMT] {dagbag.py:525} INFO - Filling up the DagBag from /opt/airflow/dags/test_global_variable.py
[2023-02-06, 12:24:43 GMT] {task_command.py:384} INFO - Running <TaskInstance: test_global_variable.one scheduled__2023-02-05T09:09:00+00:00 [running]> on host 4851b30aa5cf
[2023-02-06, 12:24:43 GMT] {taskinstance.py:1590} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_OWNER=me
AIRFLOW_CTX_DAG_ID=test_global_variable
AIRFLOW_CTX_TASK_ID=one
AIRFLOW_CTX_EXECUTION_DATE=2023-02-05T09:09:00+00:00
AIRFLOW_CTX_TRY_NUMBER=1
AIRFLOW_CTX_DAG_RUN_ID=scheduled__2023-02-05T09:09:00+00:00
[2023-02-06, 12:24:44 GMT] {warnings.py:109} WARNING - /home/airflow/.local/lib/python3.8/site-packages/airflow/utils/context.py:204: AirflowContextDeprecationWarning: Accessing 'execution_date' from the template is deprecated and will be removed in a future version. Please use 'data_interval_start' or 'logical_date' instead.
  warnings.warn(_create_deprecation_warning(key, self._deprecation_replacements[key]))
[... ten further AirflowContextDeprecationWarning entries trimmed ...]
[2023-02-06, 12:24:44 GMT] {taskinstance.py:1851} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/decorators/base.py", line 188, in execute
    return_value = super().execute(context)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/operators/python.py", line 370, in execute
    return super().execute(context=serializable_context)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/operators/python.py", line 175, in execute
    return_value = self.execute_callable()
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/operators/python.py", line 678, in execute_callable
    return self._execute_python_callable_in_subprocess(python_path, tmp_path)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/operators/python.py", line 411, in _execute_python_callable_in_subprocess
    self._write_args(input_path)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/operators/python.py", line 381, in _write_args
    file.write_bytes(self.pickling_library.dumps({'args': self.op_args, 'kwargs': self.op_kwargs}))
_pickle.PicklingError: Can't pickle <function first at 0x7f80ff76e4c0>: it's not the same object as unusual_prefix_6cc7442bed7c02593e3a29524b0e65329d9f59da_test_global_variable.first
[2023-02-06, 12:24:44 GMT] {taskinstance.py:1401} INFO - Marking task as FAILED. dag_id=test_global_variable, task_id=one, execution_date=20230205T090900, start_date=20230206T122443, end_date=20230206T122444
[2023-02-06, 12:24:44 GMT] {standard_task_runner.py:102} ERROR - Failed to execute job 72751 for task one (Can't pickle <function first at 0x7f80ff76e4c0>: it's not the same object as unusual_prefix_6cc7442bed7c02593e3a29524b0e65329d9f59da_test_global_variable.first; 239657)
[2023-02-06, 12:24:44 GMT] {local_task_job.py:164} INFO - Task exited with return code 1
[2023-02-06, 12:24:44 GMT] {local_task_job.py:273} INFO - 0 downstream tasks scheduled from follow-on schedule check
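The decisive line is the `_pickle.PicklingError: ... it's not the same object as ...`. That message is pickle's standard failure when a function is serialized by reference (module + qualified name) but the module-level name no longer resolves to the very same object, which can happen when a function is wrapped or the DAG module is re-imported during parsing. A minimal, Airflow-free reproduction of that error class:

```python
import pickle

def f():
    return 1

g = f          # keep a reference to the original function object

def f():       # rebind the module-level name 'f' to a new object
    return 2

# pickle serializes plain functions by looking up their qualified name;
# since the name 'f' no longer points at the object held by 'g',
# pickling fails with the same "it's not the same object" error.
try:
    pickle.dumps(g)
    err = None
except pickle.PicklingError as e:
    err = str(e)

print(err)
```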

I have tried to fix it based on other posts; here is my attempt:

from airflow.decorators import dag, task
from pendulum import datetime

@dag(
    dag_id='test_global_variable',
    start_date=datetime(2022, 12, 10),
    schedule=None,
    catchup=False,
)
def write_var():

    @task.external_python(task_id="task_1", python='/opt/airflow/venv1/bin/python3')
    def add_to_list(my_list):
        print(my_list)
        my_list.append(19)
        return my_list

    @task.external_python(task_id="task_2", python='/opt/airflow/venv1/bin/python3')
    def add_to_list_2(my_list):
        print(my_list)
        my_list.append(42)
        return my_list

    add_to_list_2(add_to_list([23, 5, 8]))

write_var()
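This version works because the list crosses the task boundary as an explicit argument and return value, i.e. as plain picklable data, instead of arriving through the context `**kwargs`. A quick standalone check that such a payload pickles cleanly:

```python
import pickle

# A plain list survives a pickle round trip unchanged, which is why
# passing it as an explicit task argument works where **kwargs did not.
payload = [23, 5, 8]
roundtrip = pickle.loads(pickle.dumps(payload))
print(roundtrip)  # → [23, 5, 8]
```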

Log from the successful task:

[2023-02-06, 15:36:52 GMT] {taskinstance.py:1165} INFO - Dependencies all met for <TaskInstance: test_global_variable.task_1 manual__2023-02-06T15:36:51.225176+00:00 [queued]>
[2023-02-06, 15:36:52 GMT] {taskinstance.py:1165} INFO - Dependencies all met for <TaskInstance: test_global_variable.task_1 manual__2023-02-06T15:36:51.225176+00:00 [queued]>
[2023-02-06, 15:36:52 GMT] {taskinstance.py:1362} INFO -
--------------------------------------------------------------------------------
[2023-02-06, 15:36:52 GMT] {taskinstance.py:1363} INFO - Starting attempt 1 of 1
[2023-02-06, 15:36:52 GMT] {taskinstance.py:1364} INFO -
--------------------------------------------------------------------------------
[2023-02-06, 15:36:52 GMT] {taskinstance.py:1383} INFO - Executing <Task(_PythonExternalDecoratedOperator): task_1> on 2023-02-06 15:36:51.225176+00:00
[2023-02-06, 15:36:52 GMT] {standard_task_runner.py:54} INFO - Started process 249785 to run task
[2023-02-06, 15:36:52 GMT] {standard_task_runner.py:82} INFO - Running: ['airflow', 'tasks', 'run', 'test_global_variable', 'task_1', 'manual__2023-02-06T15:36:51.225176+00:00', '--job-id', '72908', '--raw', '--subdir', 'DAGS_FOLDER/test_global_variable.py', '--cfg-path', '/tmp/tmpuw6bfiif']
[2023-02-06, 15:36:52 GMT] {standard_task_runner.py:83} INFO - Job 72908: Subtask task_1
[2023-02-06, 15:36:52 GMT] {dagbag.py:525} INFO - Filling up the DagBag from /opt/airflow/dags/test_global_variable.py
[2023-02-06, 15:36:52 GMT] {taskmixin.py:205} WARNING - Dependency <Task(_PythonExternalDecoratedOperator): task_1>, task_2 already registered for DAG: test_global_variable
[2023-02-06, 15:36:52 GMT] {taskmixin.py:205} WARNING - Dependency <Task(_PythonExternalDecoratedOperator): task_2>, task_1 already registered for DAG: test_global_variable
[... twelve further "already registered" warnings trimmed ...]
[2023-02-06, 15:36:52 GMT] {task_command.py:384} INFO - Running <TaskInstance: test_global_variable.task_1 manual__2023-02-06T15:36:51.225176+00:00 [running]> on host 4851b30aa5cf
[2023-02-06, 15:36:52 GMT] {taskinstance.py:1590} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_OWNER=airflow
AIRFLOW_CTX_DAG_ID=test_global_variable
AIRFLOW_CTX_TASK_ID=task_1
AIRFLOW_CTX_EXECUTION_DATE=2023-02-06T15:36:51.225176+00:00
AIRFLOW_CTX_TRY_NUMBER=1
AIRFLOW_CTX_DAG_RUN_ID=manual__2023-02-06T15:36:51.225176+00:00
[2023-02-06, 15:36:53 GMT] {process_utils.py:179} INFO - Executing cmd: /opt/airflow/venv1/bin/python3 /tmp/tmd35abbbcv/script.py /tmp/tmd35abbbcv/script.in /tmp/tmd35abbbcv/script.out /tmp/tmd35abbbcv/string_args.txt
[2023-02-06, 15:36:53 GMT] {process_utils.py:183} INFO - Output:
[2023-02-06, 15:36:54 GMT] {process_utils.py:187} INFO - [23, 5, 8]
[2023-02-06, 15:36:54 GMT] {python.py:177} INFO - Done. Returned value was: [23, 5, 8, 19]
[2023-02-06, 15:36:54 GMT] {taskinstance.py:1401} INFO - Marking task as SUCCESS. dag_id=test_global_variable, task_id=task_1, execution_date=20230206T153651, start_date=20230206T153652, end_date=20230206T153654
[2023-02-06, 15:36:54 GMT] {local_task_job.py:164} INFO - Task exited with return code 0
[2023-02-06, 15:36:54 GMT] {local_task_job.py:273} INFO - 1 downstream tasks scheduled from follow-on schedule check

Screenshot: (image not included)

Answer 1 (score: 1)

I'm curious what you tried for Airflow XCom? The following DAG passes a list from one task to another using XCom via the TaskFlow API. Tested on Airflow 2.5.1, but it should work the same with 2.4.1.

from airflow.decorators import dag, task
from pendulum import datetime

@dag(
    start_date=datetime(2022, 12, 10),
    schedule=None,
    catchup=False,
)
def write_var():

    @task.external_python(
        task_id="task_1",
        python='/home/astro/.pyenv/versions/my_env/bin/python',
    )
    def add_to_list(my_list):
        print(my_list)
        my_list.append(19)
        return my_list

    @task.external_python(
        task_id="task_2",
        python='/home/astro/.pyenv/versions/my_env/bin/python',
    )
    def add_to_list_2(my_list):
        print(my_list)
        my_list.append(42)
        return my_list

    add_to_list_2(add_to_list([23, 5, 8]))

write_var()
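Note on the pattern: each task receives the previous task's return value via XCom rather than reading a shared global, and the task_1 >> task_2 dependency is inferred from the call chain. Stripped of Airflow, the data flow is just function chaining:

```python
# Plain-Python equivalent of the XCom data flow in the DAG above:
# each function receives the previous one's return value.
def add_to_list(my_list):
    my_list.append(19)
    return my_list

def add_to_list_2(my_list):
    my_list.append(42)
    return my_list

result = add_to_list_2(add_to_list([23, 5, 8]))
print(result)  # → [23, 5, 8, 19, 42]
```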

Screenshot: (image not included)


huangapple
  • Posted on 2023-02-06 20:39:18
  • Please keep this link when reposting: https://go.coder-hub.com/75361423.html