The Dataflow/GCE service account can access the view, but not its underlying dataset (which should not be a problem). at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.executeWithRetries(BigQueryServicesImpl.java)
{gcp_dataflow_hook.py:108} INFO - Start waiting for DataFlow process to complete.
[2017-09-12 16:59:38,225] File "/usr/lib/python2.7/site-packages/airflow/contrib/hooks/gcp_dataflow_hook.py", line 146, in start_j
I noticed that after the upgrade, the Dataflow GET API uses location instead of job id in the URL: GET https://dataflow.googleapis.com/v1b3/projects/umg-dealt=json Ideally, the URL should look like this: GET https://dataflow.googleapis.com/v1b3/projects/umg-de/locations/us-central1
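The regional endpoint described above can be sketched as a small URL builder, mirroring what a polling hook would construct. This is a minimal sketch, not the Airflow hook's actual code; the project, location, and job id below are placeholder values:

```python
# Sketch: building the regional Dataflow jobs.get URL that includes the
# job's location, as expected by the v1b3 API.
DATAFLOW_BASE = "https://dataflow.googleapis.com/v1b3"

def job_get_url(project: str, location: str, job_id: str) -> str:
    """Return the regional jobs.get URL:
    .../projects/<project>/locations/<location>/jobs/<job_id>"""
    return f"{DATAFLOW_BASE}/projects/{project}/locations/{location}/jobs/{job_id}"

# Placeholder values for illustration only.
print(job_get_url("my-project", "us-central1", "my-job-id"))
```

Polling this locations-scoped path instead of the location-less one avoids lookups failing for jobs launched in a non-default region.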
I am running my Google Dataflow job on Google Cloud Platform (GCP). When I run this job locally it works fine, but when it runs on GCP I get the error "java.lang.IllegalArgumentException: No filesystem found for scheme gs". My job id on GCP: 2019-08-09_16_41_15-11728697820819900062 (BigQueryHelpers.java:689)
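A common cause of "No filesystem found for scheme gs" in a Beam job that works locally but fails on Dataflow is a fat jar that drops the `META-INF/services` registrations Beam uses to discover filesystems (including the `gs://` handler from `beam-sdks-java-io-google-cloud-platform`). The sketch below assumes a Maven shade build; it merges service files so the registrar entries survive shading:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <transformers>
      <!-- Merge META-INF/services files so Beam's FileSystemRegistrar
           entries (including the gs:// handler) are kept in the fat jar -->
      <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
    </transformers>
  </configuration>
</plugin>
```

Also verify that `beam-sdks-java-io-google-cloud-platform` is actually on the job's classpath; without it there is no `gs://` filesystem to register.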
at org.apa
line 184, in execut
File "/usr/local/lib/airflow/airflow/contrib/hooks/gcp_dataflow_hook.py/contrib/hooks/gcp_api_base_hook.py", line 286, in wrappe
return func(self, *args,
(MutationGroupEncoder.java:271) at org.apache.beam.sdk.io.gcp.spanner.MutationGroupEncoder.decodePrimitive(MutationGroupEncoder.java:326)
at org.apache.beam.sdk.io.
[2018-07-05 18:24:39,928] {gcp_dataflow_hook.py:108} INFO - Start waiting for DataFlow process to complete
{base_task_runner.py:95} INFO - Subtask: File "/usr/local/lib/python2.7/dist-packages/airflow/contrib/hooks/gcp