
From Zero to Production: Orchestrating LLM workflows with the Airflow AI SDK
We’ll introduce the open-source AI SDK for Apache Airflow, built to simplify the creation of production-ready LLM workflows.
In this webinar, we’ll cover everything you need to know to plan your upgrade path from Airflow 2 to 3, so you can start taking advantage of the new features as soon as possible.
In this session, we’ll break down these architectural changes and show you how to use these new features to run your tasks anywhere.
Learn how Airflow 3.0 is expanding on assets, the foundational datasets feature, by introducing asset watchers, the most powerful option for event-driven scheduling in Airflow to date.
Explore how new features in Airflow 3.0, including DAG versioning and backfills, can improve your productivity and help you manage your pipelines more efficiently. Whether you are an experienced Airflow user or just getting started, you’ll learn how to quickly troubleshoot and manage your pipeline history.
Be among the first to see Airflow 3.0 in action and get your questions answered directly by the Astronomer team.
In this session hosted with DZone, we’ll take a practical look at the essential security measures that not only support the performance of data products but also ensure a seamless path to production—from managing sensitive credentials to keeping pace with evolving compliance demands.
Join us to learn why top data teams are choosing Astro on Azure instead of Microsoft Fabric for their mission-critical Airflow workloads.
Join us to discover how Airflow helps you deliver workflows faster, minimize downtime, and scale your operations effortlessly. Plus, you’ll see how Astronomer’s Orbiter tool takes the complexity out of migration and allows you to modernize your data operations with ease.
In this webinar, we’ll explore how to take full advantage of datasets and data-aware scheduling in Apache Airflow.
Our panel of experts will discuss the state of Airflow in 2025, including trends and results from the recent annual Airflow survey.
By the end of this webinar, you’ll be ready to take the next step with Astro Observe, now compatible with OSS Airflow, Astro, Amazon Managed Workflows (MWAA), and Google Cloud Composer (GCC).
In this session, you’ll learn everything you need to know about using CI/CD to manage your Airflow DAGs.
In this session, we’ll introduce how you can enhance your data workflows by running Airflow with Astro.
In this webinar, we’ll cover best practices for running Airflow at scale so you can provide your teams with a robust, reliable Airflow service.
In this webinar hosted with Data Engineering Digest, Tamara Fingerlin, Developer Advocate at Astronomer, will walk you through many Airflow best practices and advanced features that can help you make your pipelines more manageable, adaptive, and robust.
In this webinar, you'll learn how to easily and effectively debug your DAGs when an error inevitably occurs, and implement testing so issues don’t make it to production.
In this webinar, you’ll learn how to use Airflow to orchestrate your Databricks Jobs by using the Airflow Databricks Provider.
In this webinar, we’ll give an overview of data observability for Airflow, including topics like tracking data lineage, monitoring data quality, and insights into the performance of your data operations.
Whether you’re writing traditional ELT/ETL pipelines or complex ML workflows, you’ll learn how to make Airflow work best for your use case.
In this webinar, we’ll cover DAG writing best practices applicable to ETL and ELT pipelines, including things like how to approach building ETL and ELT pipelines from scratch.
In this webinar, we’ll discuss Cosmos and Astro features in depth, including native support for installing and running dbt in a virtual environment to avoid dependency conflicts with Airflow.
In this webinar, we’ll cover everything you need to know to orchestrate ETL operations in Snowflake with Airflow.
We’ll cover how we think about data products at Astronomer and highlight best practices learned from the most critical data product applications we’ve seen.
The release of Airflow 2.10 brings greater flexibility and expansion of some of the most widely used features. In this webinar, we’ll cover all the significant 2.10 updates that you won’t want to miss.
In this webinar, Astronomer and Weaviate cover how they are providing modern infrastructure for world class AI applications, and how to best build production-quality applications with the latest advances in ML and AI.
The latest release of Astro, Astronomer’s fully managed Airflow platform, includes new features that let you unify workloads, improve pipeline resilience, and automate infrastructure provisioning and management.
This webinar covers a step-by-step guide to Cosmos, an open source package from Astronomer that helps you easily run your dbt Core projects as Airflow DAGs and Task Groups, all with just a few lines of code.
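For a flavor of those few lines of code, here is a minimal, hypothetical sketch of a Cosmos-rendered dbt project; the dbt project path, profile names, and ids are illustrative placeholders, not examples from the webinar itself.

```python
# A hedged sketch: render a dbt Core project as an Airflow DAG with Cosmos.
# Paths, profile names, and ids below are placeholders for your own setup.
from datetime import datetime

from cosmos import DbtDag, ProjectConfig, ProfileConfig

my_dbt_dag = DbtDag(
    dag_id="my_dbt_project",
    # Point Cosmos at the dbt project directory (hypothetical path)
    project_config=ProjectConfig("/usr/local/airflow/dbt/my_project"),
    # Reuse an existing profiles.yml (hypothetical profile/target names)
    profile_config=ProfileConfig(
        profile_name="my_profile",
        target_name="dev",
        profiles_yml_filepath="/usr/local/airflow/dbt/my_project/profiles.yml",
    ),
    schedule="@daily",
    start_date=datetime(2023, 1, 1),
    catchup=False,
)
```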
This webinar is designed for those new to Airflow, providing a thorough introduction to its core concepts and practical applications.
In this webinar, Matt Shancer, Staff Data Engineer at Grindr, will discuss how they implemented their Snowflake monitoring solution using Astronomer.
Join us to deepen your understanding of CI/CD and learn best practices for more efficient data pipeline management.
In this webinar, resident AI/ML experts from AWS and Astronomer explore top strategies to harness the potential of AI and natural language processing to drive innovation.
In this webinar, you’ll learn best practices for using the latest Airflow features, including those recently released in Airflow 2.9, for generative AI and general machine learning use cases.
In this webinar, we’ll cover all the significant 2.9 updates that you won’t want to miss. We’ll spend extra time on updates that expand the functionality of some of Airflow’s most widely used features.
The latest release of Astro, Astronomer’s fully managed Airflow platform, contains exciting new features that let you deploy faster and give you greater operational efficiency, increased visibility, security, and growth potential.
In this webinar, we’ll cover everything you need to know about using Airflow and ADF together.
Join us for an in-depth analysis of the 2023 Apache Airflow® survey, where our panel of experts explore insights into how Airflow is used across industries for data orchestration and workflow management.
In this webinar, we’ll cover best practices for running Airflow at scale in production so you can provide your teams with a robust, reliable Airflow service.
In this webinar, we’ll cover DAG writing best practices applicable to data engineers and data scientists.
In this webinar, we will cover everything you need to know about managing connections in Airflow.
In this webinar, we’ll cover all the significant Apache Airflow® 2.8 updates that make Airflow more flexible and easy to use.
A paradigm shift has led to the formation of the modern large language model operations (LLMOps) data stack, with Apache Airflow® right at the center. Join us in this webinar to explore the providers in this stack and learn how you can leverage them to take your LLMOps to the next level.
This webinar will provide a comprehensive overview of the new features in the latest Astro release, complete with live demos. Whether you're already using Astro or considering it for your data orchestration needs, don't miss this opportunity to learn from the experts and have your questions answered.
In this webinar, we’ll dive into this story with Faire and discuss how Faire leverages Airflow to power ML training and extends it with a framework that powers feature stores.
Join us for this live webinar on November 29th to learn how Apache Airflow® on Astro - An Azure Native ISV Service provides an enterprise-grade Apache Airflow platform.
Data professionals all over the world use Airflow for MLOps, LLMOps, ELT/ETL, operationalized analytics, and more. In this webinar, we’ll cover everything you need to get started with Airflow.
Join us in this webinar where we’ll show how to use the Ask Astro LLMOps reference architecture to create your own retrieval augmented LLM application, feeding state of the art models with domain specific knowledge.
In this webinar, you will get a behind-the-scenes look at how Laurel accelerated their AI journey with Airflow to stand at the forefront of ML innovations.
In this webinar, we’ll show how to level-up your Airflow usage by leveraging both the Airflow API and Astro API to automate common workflow management tasks. If you find yourself frequently completing the same steps in the Airflow UI and want greater efficiency, this is the webinar for you!
In this webinar, we’ll show how to use the new Weaviate provider to make the most of Airflow’s orchestration power for AI and ML use cases.
In this webinar, we’ll show how the Astro Cloud IDE is the easiest way to develop and test your ML pipelines and schedule them with Airflow.
In this webinar, we’ll cover everything you need to know about setup/teardown tasks and how to use them to support your data quality checks.
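As a taste of the pattern, here is a minimal sketch of setup/teardown tasks (Airflow 2.7+) using the TaskFlow API; the resource-provisioning logic is a stand-in for whatever your quality checks actually need.

```python
# A hedged sketch of @setup/@teardown wiring; the task bodies are placeholders.
from pendulum import datetime
from airflow.decorators import dag, task, setup, teardown

@dag(start_date=datetime(2023, 1, 1), schedule=None, catchup=False)
def data_quality_pipeline():
    @setup
    def create_test_resources():
        print("provisioning resources for checks")

    @task
    def run_quality_checks():
        print("running data quality checks")

    # The teardown runs even if the checks fail, once the setup succeeded
    @teardown
    def delete_test_resources():
        print("tearing down resources")

    create_test_resources() >> run_quality_checks() >> delete_test_resources()

data_quality_pipeline()
```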
In this webinar, you will get a behind-the-scenes look at how the Texas Rangers, a team at the forefront of this transformation, are using big data to deliver real-time game analytics and gain a competitive advantage.
In this webinar, we’ll cover all the great 2.7 updates that make it even easier to implement your pipelines in Airflow.
In this free, virtual workshop, you’ll get hands-on experience setting up a data ingestion pipeline with Fivetran and Astronomer.
In this webinar, we’ll show how you can use CI/CD with Airflow to manage the deployment and testing of your DAGs. We’ll discuss the benefits of CI/CD for data pipelines, how to select an appropriate deployment strategy, and more.
Recent releases of the Astro CLI have greatly expanded its functionality, bringing additional features for debugging, testing, and upgrading DAGs. In this webinar, we’ll show you how to get started with the Astro CLI and how to take advantage of these new features.
In this webinar, we’ll show you how Cosmos can be used to dynamically generate Airflow DAGs from your dbt models.
In this webinar, we’ll cover everything you need to know to use task groups both in simple and advanced ways.
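For reference, here is a minimal sketch of a simple task group; the task ids and bodies are illustrative placeholders.

```python
# A hedged sketch of a TaskGroup grouping related tasks under one UI node.
from pendulum import datetime
from airflow.decorators import dag, task
from airflow.utils.task_group import TaskGroup

@dag(start_date=datetime(2023, 1, 1), schedule=None, catchup=False)
def grouped_pipeline():
    @task
    def extract():
        return [1, 2, 3]

    # Tasks defined inside the context manager belong to the group
    with TaskGroup(group_id="transform") as transform_group:
        @task
        def clean():
            print("cleaning")

        @task
        def enrich():
            print("enriching")

        clean() >> enrich()

    # The whole group can be used in dependency expressions like a task
    extract() >> transform_group

grouped_pipeline()
```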
Build data pipelines effortlessly! Discover Astro Cloud IDE's simple notebook interface, custom operator creation, and direct deployment. Start coding now!
In this webinar, we’ll cover everything you need to know to use both simple and advanced scheduling methods.
In preparation for the upcoming Snowflake Summit, we’re excited to host a webinar on the best ways to use Snowpark with Apache Airflow®.
Sophi.io, a leading AI-powered content optimization platform, faced scaling challenges as their Apache Airflow® data pipelines grew. Learn how Sophi optimized their pipelines after migrating from Amazon Managed Workflows for Apache Airflow® (MWAA) to Astronomer, dramatically improving scalability and reliability.
Apache Airflow® is an open source tool for authoring, scheduling, and monitoring your data pipelines. In this webinar, we’ll cover everything you need to get started with Apache Airflow®.
In this webinar, we’ll show how Airflow’s notifications and Astro’s deployment and pipeline monitoring and alerting ensure you always know what’s going on in your pipelines.
In this webinar, we’ll give an overview of the Astro SDK and how it can be used to create ELT pipelines with integrated data quality checks in Python or SQL. We’ll cover everything you need to get started using this open source tool to ensure errant data never ends up in production.
In this webinar, we’ll cover everything you need to know about managing Airflow versions with Astro Runtime.
In this “Live with Astronomer” session, we’ll give an overview of the Astro SDK and show how it makes writing ELT pipelines easy and efficient. We’ll cover everything you need to get started using this open source tool in just a few minutes.
In this webinar, we’ll cover the new, open-source Astronomer Cosmos tool, a framework for dynamically generating Airflow DAGs from your dbt models that massively simplifies the Airflow-dbt integration.
In this “Live with Astronomer” session, we’ll dive into the best ways to use DuckDB with Airflow.
In this webinar, we’ll cover the implementation and nuances of different methods for passing data between your Airflow tasks.
We’ll dive into the easiest ways to test and debug your Airflow connections. We’ll show how the new `dag.test()` function allows you to test connections without even running Airflow, how new Airflow UI features make testing connections faster, and how to solve common issues that arise when working with connections.
Teams can run Airflow themselves, go with a managed infrastructure option like MWAA or GCC, or choose Astro. Whether you’ve been running Airflow in production for years or are just starting to think about the right Airflow strategy for your team, we’ll go through all the options available.
Join the Astronomer team to learn more about upcoming Astro features and our product roadmap.
In this webinar on machine learning orchestration with Apache Airflow®, we’ll provide an overview of machine learning orchestration, its importance in managing ML workflows, and how it fits in with MLOps.
In this “Live with Astronomer” session, we’ll dive into Cosmos, an open-source framework developed by Astronomer for dynamically generating Airflow DAGs from other tools.
In this webinar, we’ll dive into the newly updated Fivetran provider and discuss the benefits of using Airflow and Fivetran together for your ELT pipelines. We’ll cover how to implement the available Fivetran operators to orchestrate sync jobs, how to leverage asynchronous functionality for cost savings, and how to use lineage generated from Fivetran tasks to get insight into your data pipelines.
In this “Live with Astronomer” session, we’ll dive into the new `dag.test()` function, which allows you to debug DAGs directly in your IDE in a single, serialized Python process. We’ll show how this function lets you quickly iterate and debug errors during your DAG development, and even test argument-specific DAG runs.
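As a quick illustration, here is a sketch of the pattern (Airflow 2.5+); the DAG itself is a placeholder, and the point is the `d.test()` call at the bottom.

```python
# A hedged sketch: dag.test() runs the whole DAG in one Python process,
# so you can set breakpoints in your IDE or just run the file directly.
from pendulum import datetime
from airflow.decorators import dag, task

@dag(start_date=datetime(2023, 1, 1), schedule=None, catchup=False)
def debuggable():
    @task
    def say_hello():
        print("hello from a single serialized process")

    say_hello()

d = debuggable()

if __name__ == "__main__":
    d.test()  # `python my_dag_file.py` executes the DAG locally
```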
In this webinar, Airflow Engineering Advocate Benji Lampel will demonstrate the new features of the Great Expectations Operator, which make Great Expectations more Airflow-centric and simpler to use. The latest set of releases under the new repository hosted by Astronomer provides some dramatic changes, including a default Checkpoint feature. The webinar will feature a demo of the new operator and how to use these features.
In this “Live with Astronomer” session, we’ll dive into the newly developed asynchronous Azure operators that offer cost savings and greater scalability. We’ll show how with only small updates to your DAGs, you can take advantage of asynchronous functionality when orchestrating services like Azure Data Factory and Azure Databricks.
In this webinar, we’ll cover best practices for implementing important open-source features and frameworks that will simplify the DAG-authoring process.
In this “Live with Astronomer” session, we’ll walk through one of those contributions: the new Airflow Templates VS Code extension, which includes code completion for all Airflow provider operators.
In this webinar, you'll learn how to easily and effectively debug your DAGs when an error inevitably occurs.
With more than 30,000 downloads per month, the Fivetran provider for Airflow is incredibly popular. Using Fivetran and Airflow together gives users the benefits of first-class orchestration, pipelines as code, and automated ELT processes.
In this webinar, we’ll cover all the great 2.5 updates that make Airflow even easier to use and implement at scale.
One of the benefits of Airflow is having pipelines as Python code, which lets you treat your data pipelines like any other piece of software. In this “Live with Astronomer” session, we’ll dive into how to use the open-source Astro CLI to effectively manage your Airflow project code so you can share code with your team, test DAGs before you deploy them, keep your code organized for easy reviews, and more.
What is Airflow? Apache Airflow® is a platform used to programmatically author, schedule, and monitor data pipelines.
Live with Astronomer dives into implementing data-aware scheduling with the Astro Python SDK. The new Airflow Datasets feature allows you to schedule DAGs based on updates to your data and easily view cross-DAG relationships. This feature is part of the Astro Python SDK, so it requires almost no effort from the DAG author to implement. We'll show you everything you need to do (and don't need to do) to take advantage of Datasets.
Running tasks in a separate environment can help you avoid common data pipeline issues, like dependency conflicts or out-of-memory errors, and it can save resources. Airflow DAG authors have multiple options for running tasks in isolated environments. In this webinar, we'll cover everything you need to know.
Migrating between orchestrators can be a difficult process fraught with technical and organizational hurdles. However, the end result of applying Airflow’s orchestration capabilities is worth the effort, and working with the right partner can make this journey much easier. In this webinar, we’ll cover everything you need to know about migrating from Oozie to Airflow.
Live with Astronomer will discuss the new consolidated `schedule` parameter introduced in Airflow 2.4. We’ll provide a quick refresher of scheduling concepts and discuss how scheduling DAGs is easier and more powerful in newer versions of Airflow.
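For illustration, here is a sketch of the consolidated parameter accepting a cron string, a time delta, or no schedule at all; the dag ids are placeholders.

```python
# A hedged sketch: in Airflow 2.4+ one `schedule` argument covers what
# previously needed schedule_interval and timetable parameters.
from datetime import timedelta
from pendulum import datetime
from airflow import DAG

# Cron expression
with DAG("cron_dag", start_date=datetime(2023, 1, 1), schedule="0 6 * * *"):
    ...

# Fixed time delta between runs
with DAG("delta_dag", start_date=datetime(2023, 1, 1), schedule=timedelta(hours=4)):
    ...

# Manually triggered only
with DAG("manual_dag", start_date=datetime(2023, 1, 1), schedule=None):
    ...
```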
We’ll walk through using the new SageMaker Async Operators, as well as the new SageMaker OpenLineage integration, for end-to-end MLOps for batch inference use cases.
On October 25, Live with Astronomer will dive into updates to the dynamic task mapping feature released in Airflow 2.4. We’ll show a couple of new methods for mapping over multiple parameters, and discuss how to choose the best mapping method for your use case.
With the releases of Airflow 2.3 and 2.4, users can write DAGs that dynamically generate parallel tasks at runtime. In this webinar, we’ll cover everything you need to know to implement dynamic tasks in your DAGs.
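As a minimal illustration, here is a sketch of `.expand()`-based mapping; the input values are placeholders.

```python
# A hedged sketch of dynamic task mapping (Airflow 2.3+): .expand() creates
# one mapped task instance per input element at runtime.
from pendulum import datetime
from airflow.decorators import dag, task

@dag(start_date=datetime(2023, 1, 1), schedule=None, catchup=False)
def mapped_pipeline():
    @task
    def add_one(x: int) -> int:
        return x + 1

    @task
    def total(values):
        # Receives the collected results of all mapped instances
        print(sum(values))

    total(add_one.expand(x=[1, 2, 3]))  # three parallel task instances

mapped_pipeline()
```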
In this session, Live with Astronomer explores the new datasets feature introduced in Airflow 2.4. We’ll show how DAGs that access the same data now have explicit, visible relationships, and how DAGs can be scheduled based on updates to these datasets.
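To make the idea concrete, here is a hypothetical producer/consumer sketch; the dataset URI is just an identifier and is illustrative.

```python
# A hedged sketch of data-aware scheduling with Datasets (Airflow 2.4+).
from pendulum import datetime
from airflow import Dataset
from airflow.decorators import dag, task

orders = Dataset("s3://my-bucket/orders.csv")  # placeholder URI

@dag(start_date=datetime(2023, 1, 1), schedule="@daily", catchup=False)
def producer():
    # Declaring the dataset as an outlet marks this task as updating it
    @task(outlets=[orders])
    def update_orders():
        print("writing orders")

    update_orders()

# Scheduling on the dataset runs this DAG whenever the producer updates it
@dag(start_date=datetime(2023, 1, 1), schedule=[orders], catchup=False)
def consumer():
    @task
    def use_orders():
        print("reading orders")

    use_orders()

producer()
consumer()
```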
In this webinar, we’ll cover all the great 2.4 updates that make Airflow even more powerful and observable.
In this webinar, we’ll take an in-depth tour of the Airflow UI and cover the many features that users may not be aware of.
On September 13, Live with Astronomer will dive into implementing data transformations with the Astro Python SDK. The Astro Python SDK is an open source Python package that allows for clean and rapid development on ELT workflows. We’ll show how you can use the transform and dataframe functions to easily transform your data using Python or SQL and seamlessly transition between the two.
Executing SQL queries — one of the most common use cases for data pipelines — is a simple way to implement data quality checks. In this webinar, we’ll cover everything you need to know about using SQL for data quality checks.
The next Live with Astronomer will dive into the Astro Python SDK load_file function. The Astro Python SDK is an open source Python package that allows for clean and rapid development on ELT workflows. We’ll show how you can use load_file for the ‘Extract’ step of your pipeline to easily get data from your filesystems into your data warehouse, without any operator-specific knowledge.
Astronomer is excited to announce the release of the Astro Python SDK version 1.0. The Astro Python SDK is an open source tool powered by Airflow and maintained by Astronomer, that allows for rapid and clean development of ETL workflows using Python.
In this session we’ll dive into the new Common SQL provider package and show how to use the SQLTableCheckOperator. We’ll show how you can easily use this operator to implement data quality checks in your DAGs, ensuring that errant data never makes it to production.
In this session we’ll show how you can easily use the SQLColumnCheckOperator operator to implement data quality checks in your DAGs, ensuring that errant data never makes it to production.
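For a feel of the operator, here is a sketch with an illustrative connection id, table, and checks; adjust the column mapping to your own schema.

```python
# A hedged sketch of SQLColumnCheckOperator from the Common SQL provider.
from pendulum import datetime
from airflow import DAG
from airflow.providers.common.sql.operators.sql import SQLColumnCheckOperator

with DAG("quality_checks", start_date=datetime(2023, 1, 1), schedule=None, catchup=False):
    check_orders = SQLColumnCheckOperator(
        task_id="check_orders_columns",
        conn_id="my_db_conn",  # hypothetical connection id
        table="orders",
        column_mapping={
            "order_id": {"null_check": {"equal_to": 0}},  # no NULL ids allowed
            "amount": {"min": {"geq_to": 0}},             # no negative amounts
        },
    )
```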
We’ll delve further into how Airflow can be integrated with TensorFlow and MLflow specifically to manage ML pipelines in production, using a worked example to demonstrate.
In this session we’ll show how Astronomer’s data and intelligence team uses TaskGroups to reduce the amount of code the team has to write while adhering to DAG authoring best practices.
Operators are the building blocks of Apache Airflow®. In this webinar we’ll look under the hood, covering everything you need to know about operators to tailor them for your use cases.
Live with Astronomer will dive into using the Snowflake Deferrable Operator. We’ll show how with a very small update to your DAGs, you can start saving money when orchestrating your Snowflake queries with Airflow.
In this webinar, we’ll demystify decorators and show you everything you need to know to start using decorators in your DAGs.
Live with Astronomer will dive into the Python task decorator. We’ll show how to easily turn your Python functions into tasks in your DAG using functional programming, and how using the Python task decorator can limit the boilerplate code needed in your DAGs.
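As a minimal illustration, here is a TaskFlow sketch with placeholder functions showing how return values flow between tasks without explicit XCom code.

```python
# A hedged sketch of the @task decorator: plain Python functions become
# Airflow tasks, with dependencies and XComs handled implicitly.
from pendulum import datetime
from airflow.decorators import dag, task

@dag(start_date=datetime(2023, 1, 1), schedule=None, catchup=False)
def taskflow_example():
    @task
    def extract() -> dict:
        return {"value": 42}

    @task
    def transform(data: dict) -> int:
        return data["value"] + 1

    # Calling one task with another's output sets both the dependency
    # and the XCom passing automatically
    transform(extract())

taskflow_example()
```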
On May 24, Live with Astronomer will dive into the Dynamic Task Mapping feature introduced in Airflow 2.3. We’ll show how to easily add dynamic tasks to your DAGs, and discuss ways to make the best use of this feature.
This webinar will dive into the Astronomer Providers repository, which includes Airflow Providers containing Deferrable Operators and Sensors created by Astronomer. We’ll go beyond the basics to look at key implementation details and best practices.
The Airflow project is rapidly evolving, with frequent releases bringing advancements in DAG authoring, observability, and project stability. We’re super excited for the release of Airflow 2.3, which comes with big changes in the flexibility of DAG creation, improvements to the Airflow UI, and much more.
Airflow is sometimes thought of as primarily a data engineering tool, but its use cases are really much broader. A data analyst’s workflow typically involves ingesting and transforming data to extract insights, then presenting the insights in a manner that allows business stakeholders to easily interpret trends and take appropriate action. Airflow’s ease of use and extensive provider ecosystem make it an ideal tool for orchestrating such analytics workflows.
Data lineage is the complex set of relationships between your jobs and datasets. Using OpenLineage with Apache Airflow®, you can observe and analyze these relationships, allowing you to find and fix issues more quickly. This webinar will provide a deeper dive on OpenLineage, extending beyond the basics into key implementation details and best practices.
Apache Airflow® is flexible and powerful. It has a rich ecosystem and an incredibly active community. But are you sure you haven’t missed anything? A new feature or concept that could put your DAGs at another level? It can be challenging to keep up with the latest Airflow features, and sometimes we miss the most useful ones. For this webinar, I'd like to introduce you to a couple of lesser-known features of Apache Airflow® that can dramatically improve your data pipelines.
Airflow is purpose-built for high-scale workloads and high availability on a distributed platform. Since the advent of Airflow 2.0, there are even more tools and features to ensure that Airflow can be scaled to accommodate high-throughput, data-intensive workloads. In this webinar, Alex Kennedy will discuss the process of scaling out Airflow using the Celery and Kubernetes Executors, including the parameters that need to be tuned when adding nodes and the thought process behind deciding when it’s a good idea to scale Airflow horizontally or vertically. Consistent, aggregated logging is key when scaling Airflow, so we will also briefly discuss best practices for logging on a distributed Airflow platform, as well as the pitfalls that many Airflow users experience when designing and building their distributed Airflow platform.
At this webinar, Benji Lampel (Enterprise Platform Architect @ Astronomer) and Tal Gluck (Software Engineer @ Superconductive) will present several Airflow DAGs using Great Expectations that cover more advanced DAG patterns and data quality checking cases.
Did you know that Airflow has a fully stable REST API? In this webinar, we’ll cover how to use the API, and why it’s a great tool in your Airflow toolbox for managing and monitoring your data pipelines.
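As a small taste, here is a sketch of listing DAGs via the stable REST API; the host and credentials are placeholders, and basic auth must be enabled on your deployment.

```python
# A hedged sketch: query the stable REST API (Airflow 2.0+) for all DAGs.
import requests

resp = requests.get(
    "http://localhost:8080/api/v1/dags",   # placeholder host
    auth=("admin", "admin"),               # hypothetical credentials
)
resp.raise_for_status()

for dag in resp.json()["dags"]:
    print(dag["dag_id"], "paused:", dag["is_paused"])
```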
With Airflow 2.0, we introduced the concept of providers. We’re taking that to the next level with Astro, a new DAG writing experience, brought to you by Astronomer.
If one out of your hundreds of DAGs fails, how do you know which downstream datasets have become out-of-date? The answer is data lineage. Data lineage is the complex set of relationships between your jobs and datasets. In this webinar, you'll learn how to use OpenLineage to collect lineage metadata from Airflow and assemble a lineage graph: a picture of your pipeline worth way more than a thousand words.
Because Airflow is 100% code, knowing the basics of Python is all it takes to get started writing DAGs. However, writing DAGs that are efficient, secure, and scalable requires some Airflow-specific finesse. In this webinar, you’ll learn the best practices for writing DAGs that will ensure you get the most out of Airflow. We’ll include a reference repo with DAGs you can run yourself with the Astro CLI.
Data quality is an often overlooked component of data pipelines. Learn why it is a valuable part of data systems and how to get started integrating data quality checks into existing pipelines with a variety of tools.
With Apache Airflow®'s orchestration capabilities, manage your data like never before. Discover how to leverage Airflow in any company.
The flexibility and freedom that Airflow offers you is incredible, but to really take advantage of it you need to master some concepts first, one of which has just been released in Airflow 2.2. By the end of the webinar, you will be able to define schedule intervals that you thought were impossible before.
In this informative webinar, we will cover everything you need to know about Airflow 2.2. We'll go through all of the new features, large and small, and show you how to leverage them to get cleaner and more efficient DAGs as a result.
Airflow, by nature, is an orchestration framework, not a data processing framework. At first sight, it can be unclear how to test Airflow code. Are you triggering DAGs in the UI to validate your Airflow code? In this webinar, we'll demonstrate various examples of how to test Airflow code and integrate tests into a CI/CD pipeline, so that you're certain your code works before deploying to production.
More often than not, your Airflow components will have a desired order of execution, particularly if you are performing a traditional ETL process. For example, before the Transform step in ETL runs, Extraction must have happened in an upstream task. In this webinar, we will discuss how to properly set up dependencies and define an order of execution for your pipelines.
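For illustration, here is a minimal sketch of bitshift dependencies with placeholder tasks enforcing that order.

```python
# A hedged sketch: >> and << set upstream/downstream relationships between
# tasks; the EmptyOperator tasks stand in for real Extract/Transform/Load work.
from pendulum import datetime
from airflow import DAG
from airflow.operators.empty import EmptyOperator

with DAG("etl_order", start_date=datetime(2023, 1, 1), schedule=None, catchup=False):
    extract = EmptyOperator(task_id="extract")
    transform = EmptyOperator(task_id="transform")
    load = EmptyOperator(task_id="load")

    # Extract must finish before Transform, which must finish before Load
    extract >> transform >> load  # equivalent to: load << transform << extract
```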
Do you use Sensors in your data pipelines? Do you need to wait for a file before executing the next step? Are you looking to execute your task after a task completes in another DAG? Would you like to wait for an import in your SQL table before executing the next task?
While Airflow and ADF (Azure Data Factory) have pros and cons, they can be used in tandem for data pipelines across your organization. In this webinar, we’ll cover how using the two together can really get you the best of both worlds!
Anytime you’re running business critical pipelines, you need to know when something goes wrong. Airflow has a built in notification system that can be used to throw alerts when your DAGs fail, succeed, or anything in between. In this webinar, we’ll do a deep dive into how you can customize your notifications in Airflow to meet your needs.
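As a simple illustration, here is a sketch of a custom `on_failure_callback`; the print statement stands in for your Slack, email, or webhook logic.

```python
# A hedged sketch: a failure callback receives the task's context dict,
# so you can report which task and DAG failed.
from pendulum import datetime
from airflow.decorators import dag, task

def notify_failure(context):
    ti = context["task_instance"]
    print(f"Task {ti.task_id} in DAG {ti.dag_id} failed!")  # replace with your alert

@dag(
    start_date=datetime(2023, 1, 1),
    schedule=None,
    catchup=False,
    default_args={"on_failure_callback": notify_failure},
)
def alerting_dag():
    @task
    def flaky():
        raise ValueError("simulated failure")  # triggers the callback

    flaky()

alerting_dag()
```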
ETL is one of the most common data engineering use cases, and it's one where Airflow really shines. In this webinar, we'll cover everything you need to get started as a new Airflow user, and dive into how to implement ETL pipelines as Airflow DAGs.
The official helm chart of Apache Airflow® is out! The days of wondering what Helm Chart to use in production are over. Now, you only have one chart maintained and tested by Airflow PMC members as well as the community. It’s time to get your hands on it and take it for a spin! At the end of the webinar, you will have a fully functional Airflow instance deployed with the Official Helm Chart and running within a Kubernetes cluster locally.
In this webinar, we'll talk about when you might want to dynamically generate your DAGs, show a couple of methods for doing so, and discuss problems that can arise when implementing dynamic generation at scale.
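As one common pattern, here is a hypothetical sketch that builds DAG objects in a loop and registers them in `globals()` so the parser discovers each one; the team names and schedules are placeholders.

```python
# A hedged sketch of dynamic DAG generation via globals(); each generated
# DAG must land in the module's top-level namespace to be picked up.
from pendulum import datetime
from airflow import DAG
from airflow.operators.empty import EmptyOperator

def create_dag(dag_id: str, schedule: str) -> DAG:
    with DAG(dag_id, start_date=datetime(2023, 1, 1), schedule=schedule, catchup=False) as dag:
        EmptyOperator(task_id="placeholder")
    return dag

for team, cron in {"sales": "@daily", "marketing": "@hourly"}.items():
    globals()[f"{team}_dag"] = create_dag(f"{team}_dag", cron)
```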
In AWS, it's common for organizations to use multiple AWS accounts for various reasons, from Dev, Stage, and Prod accounts to accounts dedicated to lines of business (LOBs). What do you do when your data pipeline needs to span AWS accounts? This webinar will show how you can run a single DAG across multiple AWS accounts in a secure manner.
Learn about the core concepts, components, and benefits of working with Airflow. Watch this Intro to Airflow webinar today!
Learn more about using Airflow 2.0 with Kubernetes.
Learn everything about Airflow 2.0 providers including what defines a provider, how to create your own provider, and customizing provider packages.
Watch the webinar recap and learn how Taskflow API can help simplify DAGs that make heavy use of Python tasks and XComs.
Watch the webinar recording to learn the best practices for managing secrets with various backends in Apache Airflow® 2.0.
Learn the best practices for writing DAGs in Apache Airflow® with a repo of example DAGs that you can run with the Astro CLI.