
Question:

I want to call a REST end point using a DAG. The REST end point is, for example:

    ... = "/api/employees", consumes = "application/json")

Now I want to call this REST end point from an Airflow DAG and schedule it. What I'm doing is using SimpleHttpOperator to call it:

    t1 = SimpleHttpOperator(
        endpoint='...',
        data=json.dumps(...),
        ...
    )

ERROR:

    HTTPSConnectionPool(host='...', port=443): Max retries exceeded with url: ...
    (Caused by SSLError(SSLError("bad handshake: SysCallError(-1, 'Unexpected EOF')",)))

    File "/usr/local/lib/python3.7/site-packages/urllib3/contrib/pyopenssl.py", line 485, in wrap_socket
    File "/usr/local/lib/python3.7/site-packages/OpenSSL/SSL.py", line 1934, in do_handshake
    File "/usr/local/lib/python3.7/site-packages/OpenSSL/SSL.py", line 1664, in _raise_ssl_error
    SysCallError: (-1, 'Unexpected EOF')

    During handling of the above exception, another exception occurred:

    File "/usr/local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 672, in urlopen
    File "/usr/local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 376, in _make_request
    File "/usr/local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 994, in _validate_conn
    File "/usr/local/lib/python3.7/site-packages/urllib3/connection.py", line 394, in connect
    File "/usr/local/lib/python3.7/site-packages/urllib3/util/ssl_.py", line 370, in ssl_wrap_socket
        return context.wrap_socket(sock, server_hostname=server_hostname)
    File "/usr/local/lib/python3.7/site-packages/urllib3/contrib/pyopenssl.py", line 491, in wrap_socket
        raise ssl.SSLError("bad handshake: %r" % e)
    ssl.SSLError: ("bad handshake: SysCallError(-1, 'Unexpected EOF')",)

Airflow is running on Docker, and the docker image is puckel/docker-airflow.

Answer:

You need to consider both the Operator you are using and the underlying Hook which it uses to connect. The Hook fetches connection information from an Airflow Connection, which is just a container used to store credentials and other connection information. You can configure Connections in the Airflow UI (Admin -> Connections). So in this case, you need to first configure your HTTP Connection.

From the http_hook documentation: http_conn_id (str) – connection that has the base API url, i.e. ... Since your operator has the default http_conn_id, the hook will use the Airflow Connection called "http_default" in the Airflow UI. If you don't want to change the default one, you can create another Airflow Connection using the Airflow UI and pass the new conn_id argument to your operator.

For the HttpHook, you should configure the Connection by setting its host field equal to the base_url of your endpoint. See the source code to get a better idea of how the Connection object is used.

The value of endpoint for your Operator should then be api/employees (not the full URL); the rest it gets from the underlying http_hook. You should only be passing the relative part of your URL to the operator. According to the http_operator documentation: endpoint (str) – The relative part of the full url.

Please consider contributing an improvement; improvements are always welcome :) The Airflow project documentation is unfortunately not very clear in this case.
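To see why only the relative part belongs in `endpoint`, here is a simplified sketch of how the hook joins the Connection's base URL with the operator's `endpoint` (the `build_url` helper and the `some-host.example.com` host are illustrative, not Airflow API; the real HttpHook also derives the base URL from the Connection's schema, host, and port fields):

```python
# Simplified sketch of the base-URL + relative-endpoint join performed by
# the HTTP hook; a slash is inserted only when neither side provides one.
def build_url(base_url: str, endpoint: str) -> str:
    if base_url and not base_url.endswith('/') and endpoint and not endpoint.startswith('/'):
        return base_url + '/' + endpoint
    return (base_url or '') + (endpoint or '')

# The Connection's host supplies the base URL; the operator passes only
# the relative path.
print(build_url('http://some-host.example.com', 'api/employees'))
# -> http://some-host.example.com/api/employees
```

If you instead pass the full URL (scheme and host included) as `endpoint` while the Connection also carries a host, the joined URL is malformed, which is one common way to end up with connection and handshake failures like the one above.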

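Besides the Airflow UI, Connections can also be supplied as `AIRFLOW_CONN_<CONN_ID>` environment variables, which is convenient for the puckel/docker-airflow setup. A sketch, assuming you can set environment variables for the Airflow containers (the host value is a placeholder):

```shell
# Airflow resolves a connection id like "http_default" from an
# AIRFLOW_CONN_HTTP_DEFAULT environment variable holding a connection URI.
# The URI's scheme and host become the base URL that the hook prepends to
# the operator's relative endpoint.
export AIRFLOW_CONN_HTTP_DEFAULT='http://some-host.example.com'
echo "$AIRFLOW_CONN_HTTP_DEFAULT"
```

With this set, a SimpleHttpOperator left on the default http_conn_id would resolve its base URL from this variable instead of the "http_default" row in the UI.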