
Commit aca754d

Merge pull request #1273 from cloudsufi/bigquery_e2e_tests
BQ e2e updated tests
2 parents: b87c318 + 23c9479

File tree: 8 files changed (+537, −11 lines)
Lines changed: 197 additions & 0 deletions
@@ -0,0 +1,197 @@
# Copyright © 2023 Cask Data, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.

@BigQuery_Sink
Feature: BigQuery sink - Verification of BigQuery to BigQuery successful data transfer

  @BQ_SOURCE_DATATYPE_TEST @BQ_SINK_TEST
  Scenario: Validate successful records transfer from BigQuery to BigQuery with partition type TIME with Partition field and require partitioned filter true
    Given Open Datafusion Project to configure pipeline
    When Expand Plugin group in the LHS plugins list: "Source"
    When Select plugin: "BigQuery" from the plugins list as: "Source"
    When Expand Plugin group in the LHS plugins list: "Sink"
    When Select plugin: "BigQuery" from the plugins list as: "Sink"
    Then Connect plugins: "BigQuery" and "BigQuery2" to establish connection
    Then Navigate to the properties page of plugin: "BigQuery"
    And Enter input plugin property: "referenceName" with value: "Reference"
    And Replace input plugin property: "project" with value: "projectId"
    And Enter input plugin property: "datasetProject" with value: "projectId"
    And Replace input plugin property: "dataset" with value: "dataset"
    Then Override Service account details if set in environment variables
    And Enter input plugin property: "table" with value: "bqSourceTable"
    Then Click on the Get Schema button
    Then Validate "BigQuery" plugin properties
    And Close the Plugin Properties page
    Then Navigate to the properties page of plugin: "BigQuery2"
    Then Replace input plugin property: "project" with value: "projectId"
    Then Override Service account details if set in environment variables
    Then Enter input plugin property: "datasetProject" with value: "projectId"
    Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
    Then Enter input plugin property: "dataset" with value: "dataset"
    Then Enter input plugin property: "table" with value: "bqTargetTable"
    Then Click plugin property: "truncateTable"
    Then Click plugin property: "updateTableSchema"
    Then Enter BigQuery sink property partition field "bqPartitionFieldTime"
    Then Validate "BigQuery" plugin properties
    Then Close the BigQuery properties
    Then Save the pipeline
    Then Preview and run the pipeline
    Then Wait till pipeline preview is in running state
    Then Open and capture pipeline preview logs
    Then Verify the preview run status of pipeline in the logs is "succeeded"
    Then Close the pipeline logs
    Then Close the preview
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Wait till pipeline is in running state
    Then Open and capture logs
    Then Verify the pipeline status is "Succeeded"
    Then Verify the partition table is created with partitioned on field "bqPartitionFieldTime"

  @BQ_INSERT_SOURCE_TEST @BQ_UPDATE_SINK_TEST
  Scenario: Validate successful records transfer from BigQuery to BigQuery with Advanced Operations Update for table key
    Given Open Datafusion Project to configure pipeline
    When Expand Plugin group in the LHS plugins list: "Source"
    When Select plugin: "BigQuery" from the plugins list as: "Source"
    When Expand Plugin group in the LHS plugins list: "Sink"
    When Select plugin: "BigQuery" from the plugins list as: "Sink"
    Then Connect plugins: "BigQuery" and "BigQuery2" to establish connection
    Then Navigate to the properties page of plugin: "BigQuery"
    And Replace input plugin property: "project" with value: "projectId"
    Then Override Service account details if set in environment variables
    And Replace input plugin property: "datasetProject" with value: "datasetprojectId"
    And Replace input plugin property: "referenceName" with value: "reference"
    And Replace input plugin property: "dataset" with value: "dataset"
    And Replace input plugin property: "table" with value: "bqSourceTable"
    Then Click on the Get Schema button
    Then Validate "BigQuery" plugin properties
    And Close the Plugin Properties page
    Then Navigate to the properties page of plugin: "BigQuery2"
    Then Replace input plugin property: "project" with value: "projectId"
    Then Override Service account details if set in environment variables
    Then Enter input plugin property: "datasetProject" with value: "projectId"
    Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
    Then Enter input plugin property: "dataset" with value: "dataset"
    Then Enter input plugin property: "table" with value: "bqTargetTable"
    And Select radio button plugin property: "operation" with value: "update"
    Then Click plugin property: "updateTableSchema"
    Then Click on the Add Button of the property: "relationTableKey" with value:
      | TableKey |
    Then Validate "BigQuery" plugin properties
    And Close the Plugin Properties page
    Then Save the pipeline
    Then Preview and run the pipeline
    Then Wait till pipeline preview is in running state
    Then Open and capture pipeline preview logs
    Then Verify the preview run status of pipeline in the logs is "succeeded"
    Then Close the pipeline logs
    Then Close the preview
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Wait till pipeline is in running state
    Then Open and capture logs
    Then Close the pipeline logs
    Then Verify the pipeline status is "Succeeded"
    Then Validate the values of records transferred to BQ sink is equal to the values from source BigQuery table

  @BQ_INSERT_SOURCE_TEST @BQ_SINK_TEST
  Scenario: Validate successful records transfer from BigQuery to BigQuery with Advanced operations Upsert
    Given Open Datafusion Project to configure pipeline
    When Expand Plugin group in the LHS plugins list: "Source"
    When Select plugin: "BigQuery" from the plugins list as: "Source"
    When Expand Plugin group in the LHS plugins list: "Sink"
    When Select plugin: "BigQuery" from the plugins list as: "Sink"
    Then Connect plugins: "BigQuery" and "BigQuery2" to establish connection
    Then Navigate to the properties page of plugin: "BigQuery"
    And Replace input plugin property: "project" with value: "projectId"
    Then Override Service account details if set in environment variables
    And Replace input plugin property: "datasetProject" with value: "datasetprojectId"
    And Replace input plugin property: "referenceName" with value: "reference"
    And Replace input plugin property: "dataset" with value: "dataset"
    And Replace input plugin property: "table" with value: "bqSourceTable"
    Then Click on the Get Schema button
    Then Validate "BigQuery" plugin properties
    And Close the Plugin Properties page
    Then Navigate to the properties page of plugin: "BigQuery2"
    Then Replace input plugin property: "project" with value: "projectId"
    Then Override Service account details if set in environment variables
    Then Enter input plugin property: "datasetProject" with value: "projectId"
    Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
    Then Enter input plugin property: "dataset" with value: "dataset"
    Then Enter input plugin property: "table" with value: "bqTargetTable"
    And Select radio button plugin property: "operation" with value: "upsert"
    Then Click plugin property: "updateTableSchema"
    Then Click on the Add Button of the property: "relationTableKey" with value:
      | TableKey |
    Then Validate "BigQuery" plugin properties
    And Close the Plugin Properties page
    Then Save the pipeline
    Then Preview and run the pipeline
    Then Wait till pipeline preview is in running state
    Then Open and capture pipeline preview logs
    Then Verify the preview run status of pipeline in the logs is "succeeded"
    Then Close the pipeline logs
    Then Close the preview
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Wait till pipeline is in running state
    Then Open and capture logs
    Then Close the pipeline logs
    Then Verify the pipeline status is "Succeeded"
    Then Validate the values of records transferred to BQ sink is equal to the values from source BigQuery table

  @BQ_SOURCE_DATATYPE_TEST @BQ_SINK_TEST
  Scenario: Validate successful records transfer from BigQuery to BigQuery with clustering order functionality
    Given Open Datafusion Project to configure pipeline
    When Expand Plugin group in the LHS plugins list: "Source"
    When Select plugin: "BigQuery" from the plugins list as: "Source"
    When Expand Plugin group in the LHS plugins list: "Sink"
    When Select plugin: "BigQuery" from the plugins list as: "Sink"
    Then Connect plugins: "BigQuery" and "BigQuery2" to establish connection
    Then Navigate to the properties page of plugin: "BigQuery"
    And Enter input plugin property: "referenceName" with value: "Reference"
    And Replace input plugin property: "project" with value: "projectId"
    Then Override Service account details if set in environment variables
    And Enter input plugin property: "datasetProject" with value: "datasetprojectId"
    And Replace input plugin property: "dataset" with value: "dataset"
    And Replace input plugin property: "table" with value: "bqSourceTable"
    Then Click on the Get Schema button
    Then Validate "BigQuery" plugin properties
    And Close the Plugin Properties page
    Then Navigate to the properties page of plugin: "BigQuery2"
    Then Replace input plugin property: "project" with value: "projectId"
    Then Override Service account details if set in environment variables
    Then Enter input plugin property: "datasetProject" with value: "projectId"
    Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
    Then Enter input plugin property: "dataset" with value: "dataset"
    Then Enter input plugin property: "table" with value: "bqTargetTable"
    Then Enter BigQuery sink property partition field "bqPartitionFieldTime"
    Then Click on the Add Button of the property: "clusteringOrder" with value:
      | clusterValue |
    Then Validate "BigQuery" plugin properties
    Then Close the BigQuery properties
    Then Save the pipeline
    Then Preview and run the pipeline
    Then Wait till pipeline preview is in running state
    Then Open and capture pipeline preview logs
    Then Verify the preview run status of pipeline in the logs is "succeeded"
    Then Close the pipeline logs
    Then Close the preview
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Wait till pipeline is in running state
    Then Open and capture logs
    Then Verify the pipeline status is "Succeeded"
    Then Verify the partition table is created with partitioned on field "bqPartitionFieldTime"
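
The Update and Upsert scenarios above exercise the sink's "Advanced Operations" merge semantics keyed on relationTableKey. As a minimal sketch of what those scenarios verify (illustrative Python, not the plugin's actual implementation; `apply_operation` and its row-as-dict model are assumptions for this example):

```python
# Illustrative model of insert/update/upsert by table key, as exercised
# by the scenarios above. Rows are plain dicts; `table_key` stands in
# for the sink's relationTableKey property.
def apply_operation(target, source, table_key, operation):
    """Merge source rows into target rows keyed on table_key.

    "insert" appends every source row; "update" only overwrites rows
    whose key already exists in the target; "upsert" overwrites
    existing keys and appends new ones.
    """
    by_key = {row[table_key]: row for row in target}
    for row in source:
        key = row[table_key]
        if operation == "insert":
            target.append(row)
        elif operation == "update":
            if key in by_key:
                by_key[key].update(row)
        elif operation == "upsert":
            if key in by_key:
                by_key[key].update(row)
            else:
                target.append(row)
                by_key[key] = row
    return target
```

Under this model, an update run leaves the target's row count unchanged, while an upsert run grows it by the number of previously unseen keys.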

src/e2e-test/features/bigquery/source/BigQueryToBigQuery.feature

Lines changed: 44 additions & 1 deletion
@@ -218,4 +218,47 @@ Feature: BigQuery source - Verification of BigQuery to BigQuery successful data
     Then Wait till pipeline is in running state
     Then Open and capture logs
     Then Verify the pipeline status is "Succeeded"
-    Then Validate records transferred to target table is equal to number of records from source table
+    Then Validate the values of records transferred to BQ sink is equal to the values from source BigQuery table
+
+
+  @BQ_SOURCE_TEST @BQ_SOURCE_VIEW_TEST @BQ_SINK_TEST
+  Scenario: Validate successful records transfer from BigQuery to BigQuery by enable querying views
+    Given Open Datafusion Project to configure pipeline
+    When Expand Plugin group in the LHS plugins list: "Source"
+    When Select plugin: "BigQuery" from the plugins list as: "Source"
+    When Expand Plugin group in the LHS plugins list: "Sink"
+    When Select plugin: "BigQuery" from the plugins list as: "Sink"
+    Then Connect plugins: "BigQuery" and "BigQuery2" to establish connection
+    Then Navigate to the properties page of plugin: "BigQuery"
+    And Enter input plugin property: "referenceName" with value: "Reference"
+    And Replace input plugin property: "project" with value: "projectId"
+    And Enter input plugin property: "datasetProject" with value: "datasetprojectId"
+    And Replace input plugin property: "dataset" with value: "dataset"
+    Then Override Service account details if set in environment variables
+    And Enter input plugin property: "table" with value: "bqSourceTable"
+    Then Click on the Get Schema button
+    Then Validate "BigQuery" plugin properties
+    And Close the Plugin Properties page
+    Then Navigate to the properties page of plugin: "BigQuery2"
+    Then Replace input plugin property: "project" with value: "projectId"
+    Then Enter input plugin property: "datasetProject" with value: "projectId"
+    Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
+    Then Enter input plugin property: "dataset" with value: "dataset"
+    Then Enter input plugin property: "table" with value: "bqTargetTable"
+    Then Click plugin property: "truncateTable"
+    Then Click plugin property: "updateTableSchema"
+    Then Validate "BigQuery" plugin properties
+    Then Close the BigQuery properties
+    Then Save the pipeline
+    Then Preview and run the pipeline
+    Then Wait till pipeline preview is in running state
+    Then Open and capture pipeline preview logs
+    Then Verify the preview run status of pipeline in the logs is "succeeded"
+    Then Close the pipeline logs
+    Then Close the preview
+    Then Deploy the pipeline
+    Then Run the Pipeline in Runtime
+    Then Wait till pipeline is in running state
+    Then Open and capture logs
+    Then Verify the pipeline status is "Succeeded"
+    Then Validate the values of records transferred to BQ sink is equal to the values from source BigQuery table
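
Note the change in the first hunk: the old step only compared record counts, while the new "Validate the values of records transferred to BQ sink" step compares the actual values. As a sketch of that stronger check (illustrative only; `records_match` is a hypothetical helper, not the project's step definition):

```python
# Order-insensitive value comparison between source and sink row sets,
# of the kind the new validation step performs. Rows are dicts.
def records_match(source_rows, sink_rows):
    """True when both row sets hold identical values, ignoring row order."""
    def canon(rows):
        # Each row becomes a sorted tuple of (column, value) pairs so
        # that column order and row order cannot affect the comparison.
        return sorted(tuple(sorted(r.items())) for r in rows)
    return canon(source_rows) == canon(sink_rows)
```

A count-only check would pass even if the sink held the right number of wrong rows; this value-level comparison catches that.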

src/e2e-test/features/bigquery/source/BigQueryToGCS.feature

Lines changed: 1 addition & 0 deletions
@@ -75,6 +75,7 @@ Feature: BigQuery source - Verification of BigQuery to GCS successful data trans
     Then Open and capture logs
     Then Verify the pipeline status is "Succeeded"
     Then Verify data is transferred to target GCS bucket
+    Then Validate the values of records transferred to GCS bucket is equal to the values from source BigQuery table
 
   @BQ_SOURCE_TEST @BQ_SOURCE_VIEW_TEST @GCS_SINK_TEST
   Scenario:Validate successful records transfer from BigQuery to GCS by enable querying views
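
This hunk adds the same value-level validation for the GCS sink: the records written to the bucket (e.g. as CSV) must equal the source table's values, not just its row count. A minimal sketch of such a check, assuming CSV output with a header row (`csv_matches_rows` is a hypothetical helper, not the project's step definition):

```python
# Compare CSV content fetched from a bucket object against expected
# source rows, ignoring row order. DictReader yields string values,
# so expected rows use strings too.
import csv
import io

def csv_matches_rows(csv_text, expected_rows):
    """Parse header+data CSV and compare to expected dicts, order-insensitively."""
    parsed = list(csv.DictReader(io.StringIO(csv_text)))
    def canon(rows):
        return sorted(tuple(sorted(r.items())) for r in rows)
    return canon(parsed) == canon(expected_rows)
```

In a real test the `csv_text` would come from downloading the object the pipeline wrote to the target bucket.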
