Bitbucket (self-hosted)
Port's Bitbucket (Self-Hosted) integration allows you to model Bitbucket Server / Bitbucket Data Center resources in your software catalog and ingest data into them.
Note: While Bitbucket Server has been deprecated and replaced by Bitbucket Data Center, they expose the same set of APIs and this integration covers both.
This documentation covers Port's integration with Bitbucket (Self-Hosted). For information about integrating with Bitbucket Cloud, please refer to the Bitbucket Cloud integration documentation.
Overview
This integration allows you to:
- Map and organize your desired Bitbucket resources and their metadata in Port (see supported resources below).
- Watch for Bitbucket object changes (create/update/delete) in real-time, and automatically apply the changes to your software catalog.
Supported resources
The resources that can be ingested from Bitbucket (Self-Hosted) into Port are listed below.
It is possible to reference any field that appears in the API responses linked below in the mapping configuration.
Setup
Choose one of the following installation methods:
- Hosted by Port
- Real-Time (self-hosted)
- Scheduled (CI)
Using this installation option means that the integration will be hosted by Port, with a customizable resync interval to ingest data into Port.
Live event support
This integration supports live events, allowing real-time updates to your software catalog without waiting for the next scheduled sync.
Supported live event triggers
Pull Request:
- pr:modified
- pr:opened
- pr:merged
- pr:reviewer:updated
- pr:declined
- pr:deleted
- pr:comment:deleted
- pr:from_ref_updated
- pr:comment:edited
- pr:reviewer:unapproved
- pr:reviewer:needs_work
- pr:reviewer:approved
- pr:comment:added
Repository:
- repo:modified
- repo:refs_changed
Project:
- project:modified
Alternatively, you can install the integration using the Real-time (self-hosted) method to update Port in real time using webhooks.
Installation
To install, follow these steps:
- Go to the Data sources page of your portal.
- Click on the + Data source button in the top-right corner.
- Click on the relevant integration in the list.
- Under Select your installation method, choose Hosted by Port.
- Configure the integration settings and application settings as you wish (see below for details).
Application settings
Every integration hosted by Port has the following customizable application settings, which are configurable after installation:
- Resync interval: The frequency at which Port will ingest data from the integration. There are various options available, ranging from every 1 hour to once a day.
- Send raw data examples: A boolean toggle (enabled by default). If enabled, raw data examples will be sent from the integration to Port. These examples are used when testing your mapping configuration; they allow you to run your jq expressions against real data and see the results.
Integration settings
Every integration has its own tool-specific settings, under the Integration settings section.
Each of these settings has an ⓘ icon next to it, which you can hover over to see a description of the setting.
Port secrets
Some integration settings require sensitive pieces of data, such as tokens.
For these settings, Port secrets will be used, ensuring that your sensitive data is encrypted and secure.
When filling in such a setting, its value will be obscured (shown as ••••••••).
For each such setting, Port will automatically create a secret in your organization.
To see all secrets in your organization, follow these steps.
Limitations
- The maximum time for a full sync to run is based on the configured resync interval. For very large amounts of data where a resync operation is expected to take longer, please use a longer interval.
Port source IP addresses
When using this installation method, Port will make outbound calls to your 3rd-party applications from static IP addresses.
You may need to add these addresses to your allowlist, in order to allow Port to interact with the integrated service:
- Europe (EU): 54.73.167.226, 63.33.143.237, 54.76.185.219
- United States (US): 3.234.37.33, 54.225.172.136, 3.225.234.99
You should toggle the Bitbucket Is Version 8 Point 7 Or Older option to true if you are using Bitbucket (Self-Hosted) version 8.7 or older. This is because webhook events are set up differently in Bitbucket (Self-Hosted) 8.8 and above.
Using this installation option means that the integration will be able to update Port in real time using webhooks.
Prerequisites
To install the integration, you need a Kubernetes cluster that the integration's container chart will be deployed to.
Please make sure that you have kubectl and helm installed on your machine, and that your kubectl CLI is connected to the Kubernetes cluster where you plan to install the integration.
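As a quick sanity check before installing, you can verify that both CLIs are available and that kubectl points at the intended cluster. This is a minimal sketch; output will vary with your versions and context names:

```bash
# Confirm the CLIs are installed
kubectl version --client
helm version

# Confirm kubectl is connected to the cluster you intend to install into
kubectl config current-context
kubectl get nodes
```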
If you are having trouble installing this integration, please refer to these troubleshooting steps.
For details about the available parameters for the installation, see the table below.
- Helm
- ArgoCD
To install the integration using Helm:
- Go to the Bitbucket Self-Hosted data source page in your portal.
- Select the Real-time and always on method.
- A helm command will be displayed, with default values already filled out (e.g. your Port client ID, client secret, etc.).
  Copy the command, replace the placeholders with your values, then run it in your terminal to install the integration (a sketch of such a command is shown below).
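For reference, here is a hedged sketch of what that helm command typically looks like, built from the chart repository used in the ArgoCD example and the parameter names in the table below. Treat the command displayed in your portal as the source of truth:

```bash
# Sketch only - copy the actual command from your Port portal,
# which already contains your client ID, secret and integration identifier.
helm repo add port-labs https://port-labs.github.io/helm-charts
helm repo update

helm upgrade --install my-ocean-bitbucket-server-integration port-labs/port-ocean \
  --set port.clientId="YOUR_PORT_CLIENT_ID" \
  --set port.clientSecret="YOUR_PORT_CLIENT_SECRET" \
  --set port.baseUrl="https://api.getport.io" \
  --set initializePortResources=true \
  --set sendRawDataExamples=true \
  --set scheduledResyncInterval=120 \
  --set integration.identifier="my-ocean-bitbucket-server-integration" \
  --set integration.type="bitbucket-server" \
  --set integration.eventListener.type="POLLING" \
  --set integration.config.bitbucketBaseUrl="https://your-bitbucket.example.com" \
  --set integration.config.bitbucketUsername="YOUR_BITBUCKET_USERNAME" \
  --set integration.secrets.bitbucketPassword="YOUR_BITBUCKET_PASSWORD"
```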
The port_region, port.baseUrl, portBaseUrl, port_base_url and OCEAN__PORT__BASE_URL parameters are used to select which instance of the Port API will be used.
Port exposes two API instances, one for the EU region of Port, and one for the US region of Port:
- If you use the EU region of Port (https://app.port.io), your API URL is https://api.port.io.
- If you use the US region of Port (https://app.us.port.io), your API URL is https://api.us.port.io.
To install the integration using ArgoCD:
- Create a values.yaml file in argocd/my-ocean-bitbucket-server-integration in your git repository with the following content:

Remember to replace the placeholders for BITBUCKET_USERNAME, BITBUCKET_PASSWORD, BITBUCKET_BASE_URL, BITBUCKET_WEBHOOK_SECRET and BITBUCKET_IS_VERSION_8_POINT_7_OR_OLDER.
initializePortResources: true
scheduledResyncInterval: 120
integration:
  identifier: my-ocean-bitbucket-server-integration
  type: bitbucket-server
  eventListener:
    type: POLLING
  config:
    bitbucketBaseUrl: BITBUCKET_BASE_URL
    bitbucketUsername: BITBUCKET_USERNAME
    bitbucketIsVersion8Point7OrOlder: BITBUCKET_IS_VERSION_8_POINT_7_OR_OLDER
  secrets:
    bitbucketPassword: BITBUCKET_PASSWORD
    bitbucketWebhookSecret: BITBUCKET_WEBHOOK_SECRET
- Install the my-ocean-bitbucket-server-integration ArgoCD Application by creating the following my-ocean-bitbucket-server-integration.yaml manifest:

Remember to replace the placeholders for YOUR_PORT_CLIENT_ID, YOUR_PORT_CLIENT_SECRET and YOUR_GIT_REPO_URL.
The ArgoCD documentation for applications with multiple sources can be found here.
ArgoCD Application
apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: my-ocean-bitbucket-server-integration
  namespace: argocd
spec:
  destination:
    namespace: my-ocean-bitbucket-server-integration
    server: https://kubernetes.default.svc
  project: default
  sources:
    - repoURL: 'https://port-labs.github.io/helm-charts/'
      chart: port-ocean
      targetRevision: 0.1.14
      helm:
        valueFiles:
          - $values/argocd/my-ocean-bitbucket-server-integration/values.yaml
        parameters:
          - name: port.clientId
            value: YOUR_PORT_CLIENT_ID
          - name: port.clientSecret
            value: YOUR_PORT_CLIENT_SECRET
          - name: port.baseUrl
            value: https://api.getport.io
    - repoURL: YOUR_GIT_REPO_URL
      targetRevision: main
      ref: values
  syncPolicy:
    automated:
      prune: true
      selfHeal: true
    syncOptions:
      - CreateNamespace=true
The port_region, port.baseUrl, portBaseUrl, port_base_url and OCEAN__PORT__BASE_URL parameters are used to select which instance of the Port API will be used.
Port exposes two API instances, one for the EU region of Port, and one for the US region of Port:
- If you use the EU region of Port (https://app.port.io), your API URL is https://api.port.io.
- If you use the US region of Port (https://app.us.port.io), your API URL is https://api.us.port.io.
- Apply your application manifest with kubectl:
kubectl apply -f my-ocean-bitbucket-server-integration.yaml
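After applying the manifest, you can check that the ArgoCD application synced and that the integration pod came up. This is a hedged sketch, assuming ArgoCD's CRDs are installed and the namespace is created by the manifest above:

```bash
# Inspect the ArgoCD application status
kubectl get applications.argoproj.io -n argocd my-ocean-bitbucket-server-integration

# Check that the integration pod is running in its namespace
kubectl get pods -n my-ocean-bitbucket-server-integration
```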
This table summarizes the available parameters for the installation.
Parameter | Description | Required |
---|---|---|
port.clientId | Your port client id | ✅ |
port.clientSecret | Your port client secret | ✅ |
port.baseUrl | Your Port API URL - https://api.getport.io for EU, https://api.us.getport.io for US | ✅ |
integration.identifier | Change the identifier to describe your integration | ✅ |
integration.type | The integration type | ✅ |
integration.eventListener.type | The event listener type | ✅ |
integration.config.bitbucketUsername | Bitbucket username | |
integration.secrets.bitbucketPassword | Bitbucket password | |
integration.secrets.bitbucketWebhookSecret | Bitbucket webhook secret used to verify the webhook request | |
integration.config.bitbucketBaseUrl | Bitbucket base url | ✅ |
integration.config.bitbucketIsVersion8Point7OrOlder | Bitbucket is version 8.7 or older | ❌ |
scheduledResyncInterval | The number of minutes between each resync | ❌ |
initializePortResources | Default: true. When set to true, the integration will create default blueprints and the Port app config mapping | ❌ |
sendRawDataExamples | Enable sending raw data examples from the third party API to Port for testing and managing the integration mapping. Default is true | ❌ |
baseUrl | The base URL of the instance where the Bitbucket (Self-Hosted) integration is hosted, used for real-time updates (e.g. https://mybitbucket-self-hosted-ocean-integration.com) | ❌ |
Note: You should set the integration.config.bitbucketIsVersion8Point7OrOlder parameter to true if you are using Bitbucket (Self-Hosted) version 8.7 or older. This is because webhook events are set up differently in Bitbucket (Self-Hosted) 8.8 and above.
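With the Helm installation, for example, this can be passed as an additional value (a sketch using the parameter name from the table above):

```bash
# Add to the helm upgrade --install command when running Bitbucket 8.7 or older
--set integration.config.bitbucketIsVersion8Point7OrOlder=true
```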
For advanced configuration such as proxies or self-signed certificates, click here.
This workflow/pipeline will run the Bitbucket (Self-Hosted) integration once and then exit; this is useful for scheduled ingestion of data.
If you want the integration to update Port in real time using webhooks, you should use the Real-time (self-hosted) installation option.
- GitHub
- Jenkins
- Azure DevOps
- GitLab
Make sure to configure the following GitHub secrets:
Parameter | Description | Example | Required |
---|---|---|---|
port_client_id | Your Port client ID (How to get the credentials) | | ✅ |
port_client_secret | Your Port client secret (How to get the credentials) | | ✅ |
port_base_url | Your Port API URL - https://api.getport.io for EU, https://api.us.getport.io for US | | ✅ |
config -> bitbucket_username | Bitbucket username | | ✅ |
config -> bitbucket_password | Bitbucket password | | ✅ |
config -> bitbucket_base_url | Bitbucket base URL | | ✅ |
config -> bitbucket_webhook_secret | Bitbucket webhook secret used to verify the webhook request | | ❌ |
config -> bitbucket_is_version_8_point_7_or_older | Bitbucket is version 8.7 or older | | ❌ |
initialize_port_resources | Default: true. When set to true, the integration will create default blueprints and the Port app config mapping. Read more about initializePortResources | | ❌ |
identifier | The identifier of the integration that will be installed | | ❌ |
version | The version of the integration that will be installed | latest | ❌ |
sendRawDataExamples | Enable sending raw data examples from the third party API to Port for testing and managing the integration mapping. Default is true | true | |
baseUrl | The host of the Port Ocean app. Used to set up the integration endpoint as the target for webhooks created in Bitbucket (Self-Hosted) | https://my-ocean-integration.com | ❌ |
The following example uses the Ocean Sail GitHub Action to run the Bitbucket (Self-Hosted) integration. For further information about the action, please visit the Ocean Sail GitHub Action page.
Here is an example bitbucket-server-integration.yml workflow file:
name: Bitbucket (Self-Hosted) Exporter Workflow

on:
  workflow_dispatch:
  schedule:
    - cron: '0 */1 * * *' # Determines the scheduled interval for this workflow. This example runs every hour.

jobs:
  run-integration:
    runs-on: ubuntu-latest
    timeout-minutes: 30 # Set a time limit for the job
    steps:
      - uses: port-labs/ocean-sail@v1
        with:
          type: 'bitbucket-server'
          port_client_id: ${{ secrets.OCEAN__PORT__CLIENT_ID }}
          port_client_secret: ${{ secrets.OCEAN__PORT__CLIENT_SECRET }}
          port_base_url: https://api.getport.io
          config: |
            bitbucket_username: ${{ secrets.OCEAN__INTEGRATION__CONFIG__BITBUCKET_USERNAME }}
            bitbucket_password: ${{ secrets.OCEAN__INTEGRATION__CONFIG__BITBUCKET_PASSWORD }}
            bitbucket_base_url: ${{ secrets.OCEAN__INTEGRATION__CONFIG__BITBUCKET_BASE_URL }}
            bitbucket_webhook_secret: ${{ secrets.OCEAN__INTEGRATION__CONFIG__BITBUCKET_WEBHOOK_SECRET }}
            bitbucket_is_version_8_point_7_or_older: ${{ secrets.OCEAN__INTEGRATION__CONFIG__BITBUCKET_IS_VERSION_8_POINT_7_OR_OLDER }}
Note: You should set the bitbucket_is_version_8_point_7_or_older parameter to true if you are using Bitbucket (Self-Hosted) version 8.7 or older. This is because webhook events are set up differently in Bitbucket (Self-Hosted) 8.8 and above.
Your Jenkins agent should be able to run docker commands.
Make sure to configure the following Jenkins Credentials of Secret Text type:
Parameter | Description | Example | Required |
---|---|---|---|
OCEAN__INTEGRATION__CONFIG__BITBUCKET_USERNAME | Bitbucket Server username | | ✅ |
OCEAN__INTEGRATION__CONFIG__BITBUCKET_PASSWORD | Bitbucket Server password | | ✅ |
OCEAN__INTEGRATION__CONFIG__BITBUCKET_BASE_URL | Bitbucket Server base URL | | ✅ |
OCEAN__INTEGRATION__CONFIG__BITBUCKET_WEBHOOK_SECRET | Bitbucket Server webhook secret used to verify the webhook request | | ❌ |
OCEAN__INTEGRATION__CONFIG__BITBUCKET_IS_VERSION_8_POINT_7_OR_OLDER | Bitbucket Server is version 8.7 or older | | ❌ |
OCEAN__PORT__CLIENT_ID | Your Port client ID (How to get the credentials) | | ✅ |
OCEAN__PORT__CLIENT_SECRET | Your Port client secret (How to get the credentials) | | ✅ |
OCEAN__PORT__BASE_URL | Your Port API URL - https://api.getport.io for EU, https://api.us.getport.io for US | | ✅ |
OCEAN__INITIALIZE_PORT_RESOURCES | Default: true. When set to true, the integration will create default blueprints and the Port app config mapping. Read more about initializePortResources | | ❌ |
OCEAN__INTEGRATION__IDENTIFIER | The identifier of the integration that will be installed | | ❌ |
OCEAN__SEND_RAW_DATA_EXAMPLES | Enable sending raw data examples from the third party API to Port for testing and managing the integration mapping. Default is true | | ❌ |
OCEAN__BASE_URL | The host of the Port Ocean app. Used to set up the integration endpoint as the target for webhooks created in Bitbucket (Self-Hosted) | | ❌ |
Here is an example Jenkinsfile Groovy pipeline file:
pipeline {
agent any
stages {
stage('Run Bitbucket Self-Hosted Integration') {
steps {
script {
withCredentials([
string(credentialsId: 'OCEAN__INTEGRATION__CONFIG__BITBUCKET_USERNAME', variable: 'OCEAN__INTEGRATION__CONFIG__BITBUCKET_USERNAME'),
string(credentialsId: 'OCEAN__INTEGRATION__CONFIG__BITBUCKET_PASSWORD', variable: 'OCEAN__INTEGRATION__CONFIG__BITBUCKET_PASSWORD'),
string(credentialsId: 'OCEAN__INTEGRATION__CONFIG__BITBUCKET_BASE_URL', variable: 'OCEAN__INTEGRATION__CONFIG__BITBUCKET_BASE_URL'),
string(credentialsId: 'OCEAN__INTEGRATION__CONFIG__BITBUCKET_WEBHOOK_SECRET', variable: 'OCEAN__INTEGRATION__CONFIG__BITBUCKET_WEBHOOK_SECRET'),
string(credentialsId: 'OCEAN__INTEGRATION__CONFIG__BITBUCKET_IS_VERSION_8_POINT_7_OR_OLDER', variable: 'OCEAN__INTEGRATION__CONFIG__BITBUCKET_IS_VERSION_8_POINT_7_OR_OLDER'),
string(credentialsId: 'OCEAN__PORT__CLIENT_ID', variable: 'OCEAN__PORT__CLIENT_ID'),
string(credentialsId: 'OCEAN__PORT__CLIENT_SECRET', variable: 'OCEAN__PORT__CLIENT_SECRET'),
]) {
sh('''
#Set Docker image and run the container
integration_type="bitbucket-server"
version="latest"
image_name="ghcr.io/port-labs/port-ocean-${integration_type}:${version}"
docker run -i --rm --platform=linux/amd64 \
-e OCEAN__EVENT_LISTENER='{"type":"ONCE"}' \
-e OCEAN__INITIALIZE_PORT_RESOURCES=true \
-e OCEAN__SEND_RAW_DATA_EXAMPLES=true \
-e OCEAN__INTEGRATION__CONFIG__BITBUCKET_USERNAME=$OCEAN__INTEGRATION__CONFIG__BITBUCKET_USERNAME \
-e OCEAN__INTEGRATION__CONFIG__BITBUCKET_PASSWORD=$OCEAN__INTEGRATION__CONFIG__BITBUCKET_PASSWORD \
-e OCEAN__INTEGRATION__CONFIG__BITBUCKET_BASE_URL=$OCEAN__INTEGRATION__CONFIG__BITBUCKET_BASE_URL \
-e OCEAN__INTEGRATION__CONFIG__BITBUCKET_WEBHOOK_SECRET=$OCEAN__INTEGRATION__CONFIG__BITBUCKET_WEBHOOK_SECRET \
-e OCEAN__INTEGRATION__CONFIG__BITBUCKET_IS_VERSION_8_POINT_7_OR_OLDER=$OCEAN__INTEGRATION__CONFIG__BITBUCKET_IS_VERSION_8_POINT_7_OR_OLDER \
-e OCEAN__PORT__CLIENT_ID=$OCEAN__PORT__CLIENT_ID \
-e OCEAN__PORT__CLIENT_SECRET=$OCEAN__PORT__CLIENT_SECRET \
-e OCEAN__PORT__BASE_URL='https://api.getport.io' \
$image_name
exit $?
''')
}
}
}
}
}
}
Note: You should set the OCEAN__INTEGRATION__CONFIG__BITBUCKET_IS_VERSION_8_POINT_7_OR_OLDER parameter to true if you are using Bitbucket (Self-Hosted) version 8.7 or older. This is because webhook events are set up differently in Bitbucket (Self-Hosted) 8.8 and above.
Your Azure DevOps agent should be able to run docker commands. Learn more about agents here.
Variable groups store values and secrets you'll use in your pipelines across your project. Learn more.
Setting Up Your Credentials
- Create a Variable Group: Name it port-ocean-credentials.
- Store the required variables (see the table below).
- Authorize Your Pipeline:
- Go to "Library" -> "Variable groups."
- Find port-ocean-credentials and click on it.
- Select "Pipeline Permissions" and add your pipeline to the authorized list.

Parameter | Description | Example | Required |
---|---|---|---|
OCEAN__INTEGRATION__CONFIG__BITBUCKET_USERNAME | Bitbucket Server username | | ✅ |
OCEAN__INTEGRATION__CONFIG__BITBUCKET_PASSWORD | Bitbucket Server password | | ✅ |
OCEAN__INTEGRATION__CONFIG__BITBUCKET_BASE_URL | Bitbucket Server base URL | | ✅ |
OCEAN__INTEGRATION__CONFIG__BITBUCKET_WEBHOOK_SECRET | Bitbucket Server webhook secret used to verify the webhook request | | ❌ |
OCEAN__INTEGRATION__CONFIG__BITBUCKET_IS_VERSION_8_POINT_7_OR_OLDER | Bitbucket Server is version 8.7 or older | | ❌ |
OCEAN__PORT__CLIENT_ID | Your Port client ID (How to get the credentials) | | ✅ |
OCEAN__PORT__CLIENT_SECRET | Your Port client secret (How to get the credentials) | | ✅ |
OCEAN__PORT__BASE_URL | Your Port API URL - https://api.getport.io for EU, https://api.us.getport.io for US | | ✅ |
OCEAN__INITIALIZE_PORT_RESOURCES | Default: true. When set to true, the integration will create default blueprints and the Port app config mapping. Read more about initializePortResources | | ❌ |
OCEAN__INTEGRATION__IDENTIFIER | The identifier of the integration that will be installed | | ❌ |
OCEAN__SEND_RAW_DATA_EXAMPLES | Enable sending raw data examples from the third party API to Port for testing and managing the integration mapping. Default is true | | ❌ |
OCEAN__BASE_URL | The host of the Port Ocean app. Used to set up the integration endpoint as the target for webhooks created in Bitbucket (Self-Hosted) | | ❌ |
Here is an example bitbucket-server-integration.yml pipeline file:
trigger:
  - main

pool:
  vmImage: "ubuntu-latest"

variables:
  - group: port-ocean-credentials

steps:
  - script: |
      # Set Docker image and run the container
      integration_type="bitbucket-server"
      version="latest"
      image_name="ghcr.io/port-labs/port-ocean-$integration_type:$version"
      docker run -i --rm \
        -e OCEAN__EVENT_LISTENER='{"type":"ONCE"}' \
        -e OCEAN__INITIALIZE_PORT_RESOURCES=true \
        -e OCEAN__SEND_RAW_DATA_EXAMPLES=true \
        -e OCEAN__INTEGRATION__CONFIG__BITBUCKET_USERNAME=$(OCEAN__INTEGRATION__CONFIG__BITBUCKET_USERNAME) \
        -e OCEAN__INTEGRATION__CONFIG__BITBUCKET_PASSWORD=$(OCEAN__INTEGRATION__CONFIG__BITBUCKET_PASSWORD) \
        -e OCEAN__INTEGRATION__CONFIG__BITBUCKET_BASE_URL=$(OCEAN__INTEGRATION__CONFIG__BITBUCKET_BASE_URL) \
        -e OCEAN__INTEGRATION__CONFIG__BITBUCKET_WEBHOOK_SECRET=$(OCEAN__INTEGRATION__CONFIG__BITBUCKET_WEBHOOK_SECRET) \
        -e OCEAN__INTEGRATION__CONFIG__BITBUCKET_IS_VERSION_8_POINT_7_OR_OLDER=$(OCEAN__INTEGRATION__CONFIG__BITBUCKET_IS_VERSION_8_POINT_7_OR_OLDER) \
        -e OCEAN__PORT__CLIENT_ID=$(OCEAN__PORT__CLIENT_ID) \
        -e OCEAN__PORT__CLIENT_SECRET=$(OCEAN__PORT__CLIENT_SECRET) \
        -e OCEAN__PORT__BASE_URL='https://api.getport.io' \
        $image_name
      exit $?
    displayName: "Ingest Data into Port"
Note: You should set the OCEAN__INTEGRATION__CONFIG__BITBUCKET_IS_VERSION_8_POINT_7_OR_OLDER parameter to true if you are using Bitbucket (Self-Hosted) version 8.7 or older. This is because webhook events are set up differently in Bitbucket (Self-Hosted) 8.8 and above.
Make sure to configure the following GitLab variables:
Parameter | Description | Example | Required |
---|---|---|---|
OCEAN__INTEGRATION__CONFIG__BITBUCKET_USERNAME | Bitbucket Server username | | ✅ |
OCEAN__INTEGRATION__CONFIG__BITBUCKET_PASSWORD | Bitbucket Server password | | ✅ |
OCEAN__INTEGRATION__CONFIG__BITBUCKET_BASE_URL | Bitbucket Server base URL | | ✅ |
OCEAN__INTEGRATION__CONFIG__BITBUCKET_WEBHOOK_SECRET | Bitbucket Server webhook secret used to verify the webhook request | | ❌ |
OCEAN__INTEGRATION__CONFIG__BITBUCKET_IS_VERSION_8_POINT_7_OR_OLDER | Bitbucket Server is version 8.7 or older | | ❌ |
OCEAN__PORT__CLIENT_ID | Your Port client ID (How to get the credentials) | | ✅ |
OCEAN__PORT__CLIENT_SECRET | Your Port client secret (How to get the credentials) | | ✅ |
OCEAN__PORT__BASE_URL | Your Port API URL - https://api.getport.io for EU, https://api.us.getport.io for US | | ✅ |
OCEAN__INITIALIZE_PORT_RESOURCES | Default: true. When set to true, the integration will create default blueprints and the Port app config mapping. Read more about initializePortResources | | ❌ |
OCEAN__INTEGRATION__IDENTIFIER | The identifier of the integration that will be installed | | ❌ |
OCEAN__SEND_RAW_DATA_EXAMPLES | Enable sending raw data examples from the third party API to Port for testing and managing the integration mapping. Default is true | | ❌ |
OCEAN__BASE_URL | The host of the Port Ocean app. Used to set up the integration endpoint as the target for webhooks created in Bitbucket (Self-Hosted) | | ❌ |
Here is an example .gitlab-ci.yml pipeline file:
default:
  image: docker:24.0.5
  services:
    - docker:24.0.5-dind
  before_script:
    - docker info

variables:
  INTEGRATION_TYPE: bitbucket-server
  VERSION: latest

stages:
  - ingest

ingest_data:
  stage: ingest
  variables:
    IMAGE_NAME: ghcr.io/port-labs/port-ocean-$INTEGRATION_TYPE:$VERSION
  script:
    - |
      docker run -i --rm --platform=linux/amd64 \
        -e OCEAN__EVENT_LISTENER='{"type":"ONCE"}' \
        -e OCEAN__INITIALIZE_PORT_RESOURCES=true \
        -e OCEAN__SEND_RAW_DATA_EXAMPLES=true \
        -e OCEAN__INTEGRATION__CONFIG__BITBUCKET_USERNAME=$OCEAN__INTEGRATION__CONFIG__BITBUCKET_USERNAME \
        -e OCEAN__INTEGRATION__CONFIG__BITBUCKET_PASSWORD=$OCEAN__INTEGRATION__CONFIG__BITBUCKET_PASSWORD \
        -e OCEAN__INTEGRATION__CONFIG__BITBUCKET_BASE_URL=$OCEAN__INTEGRATION__CONFIG__BITBUCKET_BASE_URL \
        -e OCEAN__INTEGRATION__CONFIG__BITBUCKET_WEBHOOK_SECRET=$OCEAN__INTEGRATION__CONFIG__BITBUCKET_WEBHOOK_SECRET \
        -e OCEAN__INTEGRATION__CONFIG__BITBUCKET_IS_VERSION_8_POINT_7_OR_OLDER=$OCEAN__INTEGRATION__CONFIG__BITBUCKET_IS_VERSION_8_POINT_7_OR_OLDER \
        -e OCEAN__PORT__CLIENT_ID=$OCEAN__PORT__CLIENT_ID \
        -e OCEAN__PORT__CLIENT_SECRET=$OCEAN__PORT__CLIENT_SECRET \
        -e OCEAN__PORT__BASE_URL='https://api.getport.io' \
        $IMAGE_NAME
  rules: # Run only when changes are made to the main branch
    - if: '$CI_COMMIT_BRANCH == "main"'
Note: You should set the OCEAN__INTEGRATION__CONFIG__BITBUCKET_IS_VERSION_8_POINT_7_OR_OLDER parameter to true if you are using Bitbucket (Self-Hosted) version 8.7 or older. This is because webhook events are set up differently in Bitbucket (Self-Hosted) 8.8 and above.
The port_region, port.baseUrl, portBaseUrl, port_base_url and OCEAN__PORT__BASE_URL parameters are used to select which instance of the Port API will be used.
Port exposes two API instances, one for the EU region of Port, and one for the US region of Port:
- If you use the EU region of Port (https://app.port.io), your API URL is https://api.port.io.
- If you use the US region of Port (https://app.us.port.io), your API URL is https://api.us.port.io.
For advanced configuration such as proxies or self-signed certificates, click here.
Configuration
Port integrations use a YAML mapping block to ingest data from the third-party API into Port.
The mapping makes use of the JQ JSON processor to select, modify, concatenate, transform and perform other operations on existing fields and values from the integration API.
Examples
To view and test the integration's mapping against examples of the third-party API responses, use the jq playground in your data sources page. Find the integration in the list of data sources and click on it to open the playground.
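If you prefer to test expressions locally, you can also run jq from the command line against a saved copy of an API response. This is a sketch that assumes the sample project payload from the "Let's test it" section below has been saved as project.json:

```bash
# Extract the fields used by the project mapping from a saved API response
jq -r '.key' project.json                 # -> PROJ (used as the entity identifier)
jq -r '.links.self[0].href' project.json  # -> http://localhost:7990/projects/PROJ (the "link" property)
```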
Additional examples of blueprints and the relevant integration configurations:
Project
Project blueprint
{
"identifier": "bitbucketProject",
"description": "A software catalog to represent Bitbucket project",
"title": "Bitbucket Project",
"icon": "BitBucket",
"schema": {
"properties": {
"public": {
"icon": "DefaultProperty",
"title": "Public",
"type": "boolean"
},
"description": {
"title": "Description",
"type": "string",
"icon": "DefaultProperty"
},
"type": {
"icon": "DefaultProperty",
"title": "Type",
"type": "string"
},
"link": {
"title": "Link",
"icon": "DefaultProperty",
"type": "string"
}
},
"required": []
},
"mirrorProperties": {},
"calculationProperties": {},
"relations": {}
}
Integration configuration
createMissingRelatedEntities: true
deleteDependentEntities: true
resources:
  - kind: project
    selector:
      query: "true"
    port:
      entity:
        mappings:
          identifier: .key
          title: .name
          blueprint: '"bitbucketProject"'
          properties:
            public: .public
            type: .type
            description: .description
            link: .links.self[0].href
Repository
Repository blueprint
{
"identifier": "bitbucketRepository",
"description": "A software catalog to represent Bitbucket repositories",
"title": "Bitbucket Repository",
"icon": "BitBucket",
"schema": {
"properties": {
"forkable": {
"icon": "DefaultProperty",
"title": "Is Forkable",
"type": "boolean"
},
"description": {
"title": "Description",
"type": "string",
"icon": "DefaultProperty"
},
"public": {
"icon": "DefaultProperty",
"title": "Is Public",
"type": "boolean"
},
"state": {
"icon": "DefaultProperty",
"title": "State",
"type": "string"
},
"link": {
"title": "Link",
"icon": "DefaultProperty",
"type": "string"
},
"documentation": {
"icon": "DefaultProperty",
"title": "Documentation",
"type": "string",
"format": "markdown"
},
"swagger_url": {
"title": "Swagger URL",
"type": "string",
"format": "url",
"spec": "async-api",
"icon": "DefaultProperty"
},
"readme": {
"title": "Readme",
"type": "string",
"format": "markdown",
"icon": "DefaultProperty"
}
},
"required": []
},
"mirrorProperties": {},
"calculationProperties": {},
"aggregationProperties": {},
"relations": {
"latestCommitAuthor": {
"title": "Latest Commit By",
"description": "The user that made the most recent commit to the base branch",
"target": "bitbucketUser",
"required": false,
"many": false
},
"project": {
"title": "Project",
"target": "bitbucketProject",
"required": false,
"many": false
}
}
}
Integration configuration
createMissingRelatedEntities: true
deleteDependentEntities: true
resources:
  - kind: repository
    selector:
      query: "true"
    port:
      entity:
        mappings:
          identifier: .slug
          title: .name
          blueprint: '"bitbucketRepository"'
          properties:
            description: .description
            state: .state
            forkable: .forkable
            public: .public
            link: .links.self[0].href
            documentation: .__readme
          relations:
            project: .project.key
            latestCommitAuthor: .__latestCommit.author.emailAddress
Pull Request
Pull Request blueprint
{
"identifier": "bitbucketPullRequest",
"description": "A software catalog to represent Bitbucket pull requests",
"title": "Bitbucket Pull Request",
"icon": "BitBucket",
"schema": {
"properties": {
"created_on": {
"title": "Created On",
"type": "string",
"format": "date-time",
"icon": "DefaultProperty"
},
"updated_on": {
"title": "Updated On",
"type": "string",
"format": "date-time",
"icon": "DefaultProperty"
},
"description": {
"title": "Description",
"type": "string",
"icon": "DefaultProperty"
},
"state": {
"icon": "DefaultProperty",
"title": "State",
"type": "string",
"enum": [
"OPEN",
"MERGED",
"DECLINED",
"SUPERSEDED"
],
"enumColors": {
"OPEN": "yellow",
"MERGED": "green",
"DECLINED": "red",
"SUPERSEDED": "purple"
}
},
"owner": {
"title": "Owner",
"type": "string",
"icon": "DefaultProperty"
},
"link": {
"title": "Link",
"icon": "DefaultProperty",
"type": "string"
},
"destination": {
"title": "Destination Branch",
"type": "string",
"icon": "DefaultProperty"
},
"source": {
"title": "Source Branch",
"type": "string",
"icon": "DefaultProperty"
},
"reviewers": {
"items": {
"type": "string"
},
"title": "Reviewers",
"type": "array",
"icon": "DefaultProperty"
},
"merge_commit": {
"title": "Merge Commit",
"type": "string",
"icon": "DefaultProperty"
},
"mergedAt": {
"title": "Merged At",
"type": "string",
"format": "date-time",
"icon": "DefaultProperty"
}
},
"required": []
},
"mirrorProperties": {},
"calculationProperties": {},
"aggregationProperties": {},
"relations": {
"participants": {
"title": "Participants",
"description": "Users that contributed to the PR",
"target": "bitbucketUser",
"required": false,
"many": true
},
"repository": {
"title": "Repository",
"target": "bitbucketRepository",
"required": false,
"many": false
}
}
}
Integration configuration
createMissingRelatedEntities: true
deleteDependentEntities: true
resources:
  - kind: pull-request
    selector:
      query: "true"
    port:
      entity:
        mappings:
          identifier: .id | tostring
          title: .title
          blueprint: '"bitbucketPullRequest"'
          properties:
            created_on: .createdDate | (tonumber / 1000 | strftime("%Y-%m-%dT%H:%M:%SZ"))
            updated_on: .updatedDate | (tonumber / 1000 | strftime("%Y-%m-%dT%H:%M:%SZ"))
            merge_commit: .fromRef.latestCommit
            state: .state
            owner: .author.user.emailAddress
            link: .links.self[0].href
            destination: .toRef.displayId
            source: .fromRef.displayId
            mergedAt: .closedDate as $d | if $d == null then null else ($d / 1000 | strftime("%Y-%m-%dT%H:%M:%SZ")) end
            reviewers: "[.reviewers[].user.emailAddress]"
          relations:
            repository: .toRef.repository.slug
            participants: "[.participants[].user.emailAddress]"
User
User blueprint
{
"identifier": "bitbucketUser",
"description": "A software catalog to represent Bitbucket users",
"title": "Bitbucket User",
"icon": "BitBucket",
"schema": {
"properties": {
"username": {
"type": "string",
"title": "Username",
"description": "The username of the user"
},
"url": {
"title": "URL",
"description": "The link to the user profile",
"icon": "BitBucket",
"type": "string"
},
"portUser": {
"title": "Port User",
"type": "string",
"icon": "DefaultProperty",
"format": "user"
}
},
"required": [
"username"
]
},
"mirrorProperties": {},
"calculationProperties": {},
"aggregationProperties": {},
"relations": {}
}
Integration configuration
createMissingRelatedEntities: true
deleteDependentEntities: true
resources:
  - kind: user
    selector:
      query: "true"
    port:
      entity:
        mappings:
          identifier: .emailAddress
          title: .displayName
          blueprint: '"bitbucketUser"'
          properties:
            username: .name
            url: .links.self[0].href
            portUser: .emailAddress
Let's test it
This section includes sample response data from Bitbucket (Self-Hosted). In addition, it includes the entity created from the resync event based on the Ocean configuration provided in the previous section.
Payload
Here is an example of the payload structure from Bitbucket (Self-Hosted):
Project response data
{
"key": "PROJ",
"id": 1,
"name": "Project",
"public": false,
"type": "NORMAL",
"links": {
"self": [
{
"href": "http://localhost:7990/projects/PROJ"
}
]
}
}
Repository response data
{
"slug": "repostiroy-3",
"id": 3,
"name": "Repostiroy-3",
"hierarchyId": "0a8dadb07bb606236d8c",
"scmId": "git",
"state": "AVAILABLE",
"statusMessage": "Available",
"forkable": true,
"project": {
"key": "PRO3",
"id": 3,
"name": "Project Three",
"public": false,
"type": "NORMAL",
"links": {
"self": [
{
"href": "http://localhost:7990/projects/PRO3"
}
]
}
},
"public": false,
"archived": false,
"links": {
"clone": [
{
"href": "ssh://git@localhost:7999/pro3/repostiroy-3.git",
"name": "ssh"
},
{
"href": "http://localhost:7990/scm/pro3/repostiroy-3.git",
"name": "http"
}
],
"self": [
{
"href": "http://localhost:7990/projects/PRO3/repos/repostiroy-3/browse"
}
]
},
"__readme": "",
"__latestCommit": {
"id": "965068d1461f119139bb6be582bb22a555a195ba",
"displayId": "965068d1461",
"author": {
"name": "[REDACTED]",
"emailAddress": "admin@gmail.com",
"active": true,
"displayName": "Admin",
"id": 3,
"slug": "[REDACTED]",
"type": "NORMAL",
"links": {
"self": [
{
"href": "http://localhost:7990/users/[REDACTED]"
}
]
}
},
"authorTimestamp": 1747744649000,
"committer": {
"name": "[REDACTED]",
"emailAddress": "admin@gmail.com",
"active": true,
"displayName": "Admin",
"id": 3,
"slug": "[REDACTED]",
"type": "NORMAL",
"links": {
"self": [
{
"href": "http://localhost:7990/users/[REDACTED]"
}
]
}
},
"committerTimestamp": 1747744649000,
"message": "Pull request #1: readme.md edited online with Bitbucket\n\nMerge in PRO3/repostiroy-3 from main to master\n\n* commit '3e4df0573a0ba1845ebdfa919c907745497313aa':\n readme.md edited online with Bitbucket",
"parents": [
{
"id": "9534663d88977c0aa5c25249986eae851fd83a8d",
"displayId": "9534663d889"
},
{
"id": "3e4df0573a0ba1845ebdfa919c907745497313aa",
"displayId": "3e4df0573a0"
}
]
}
}
Pull Request response data
{
"id": 1,
"version": 1,
"title": "readme.md edited online with Bitbucket",
"state": "OPEN",
"open": true,
"closed": false,
"draft": false,
"createdDate": 1747730324792,
"updatedDate": 1747730324792,
"fromRef": {
"id": "refs/heads/main",
"displayId": "main",
"latestCommit": "3e4df0573a0ba1845ebdfa919c907745497313aa",
"type": "BRANCH",
"repository": {
"slug": "repostiroy-3",
"id": 3,
"name": "Repostiroy 3",
"hierarchyId": "0a8dadb07bb606236d8c",
"scmId": "git",
"state": "AVAILABLE",
"statusMessage": "Available",
"forkable": true,
"project": {
"key": "PRO3",
"id": 3,
"name": "Project Three",
"public": false,
"type": "NORMAL",
"links": {
"self": [
{
"href": "http://localhost:7990/projects/PRO3"
}
]
}
},
"public": false,
"archived": false,
"links": {
"clone": [
{
"href": "ssh://git@localhost:7999/pro3/repostiroy-3.git",
"name": "ssh"
},
{
"href": "http://localhost:7990/scm/pro3/repostiroy-3.git",
"name": "http"
}
],
"self": [
{
"href": "http://localhost:7990/projects/PRO3/repos/repostiroy-3/browse"
}
]
}
}
},
"toRef": {
"id": "refs/heads/master",
"displayId": "master",
"latestCommit": "9534663d88977c0aa5c25249986eae851fd83a8d",
"type": "BRANCH",
"repository": {
"slug": "repostiroy-3",
"id": 3,
"name": "Repostiroy 3",
"hierarchyId": "0a8dadb07bb606236d8c",
"scmId": "git",
"state": "AVAILABLE",
"statusMessage": "Available",
"forkable": true,
"project": {
"key": "PRO3",
"id": 3,
"name": "Project Three",
"public": false,
"type": "NORMAL",
"links": {
"self": [
{
"href": "http://localhost:7990/projects/PRO3"
}
]
}
},
"public": false,
"archived": false,
"links": {
"clone": [
{
"href": "ssh://git@localhost:7999/pro3/repostiroy-3.git",
"name": "ssh"
},
{
"href": "http://localhost:7990/scm/pro3/repostiroy-3.git",
"name": "http"
}
],
"self": [
{
"href": "http://localhost:7990/projects/PRO3/repos/repostiroy-3/browse"
}
]
}
}
},
"locked": false,
"author": {
"user": {
"name": "[REDACTED]",
"emailAddress": "admin@gmail.com",
"active": true,
"displayName": "Admin",
"id": 3,
"slug": "[REDACTED]",
"type": "NORMAL",
"links": {
"self": [
{
"href": "http://localhost:7990/users/[REDACTED]"
}
]
}
},
"role": "AUTHOR",
"approved": false,
"status": "UNAPPROVED"
},
"reviewers": [],
"participants": [],
"properties": {
"mergeResult": {
"outcome": "CLEAN",
"current": false
},
"resolvedTaskCount": 0,
"commentCount": 0,
"openTaskCount": 0
},
"links": {
"self": [
{
"href": "http://localhost:7990/projects/PRO3/repos/repostiroy-3/pull-requests/1"
}
]
}
}
User response data
{
"name": "[REDACTED]",
"emailAddress": "admin@gmail.com",
"active": true,
"displayName": "Admin",
"id": 3,
"slug": "[REDACTED]",
"type": "NORMAL",
"links": {
"self": [
{
"href": "http://localhost:7990/users/[REDACTED]"
}
]
}
}
Mapping result
The combination of the sample payload and the Ocean configuration generates the following Port entity:
Project entity in Port
{
"blueprint": "bitbucketProject",
"identifier": "PROJ",
"createdAt": "2025-05-20T09:14:22.361Z",
"updatedBy": "jqoQ34Azuy08BJFFUZAKyP3sXranvmgc",
"createdBy": "jqoQ34Azuy08BJFFUZAKyP3sXranvmgc",
"team": [],
"title": "Project",
"relations": {},
"properties": {
"public": false,
"link": "http://localhost:7990/projects/PROJ",
"description": null,
"type": "NORMAL"
},
"updatedAt": "2025-05-22T20:46:27.616Z"
}
Repository entity in Port
{
"blueprint": "bitbucketRepository",
"identifier": "repostiroy-3",
"createdAt": "2025-05-20T13:14:09.505Z",
"updatedBy": "jqoQ34Azuy08BJFFUZAKyP3sXranvmgc",
"createdBy": "jqoQ34Azuy08BJFFUZAKyP3sXranvmgc",
"team": [],
"title": "Repostiroy-3",
"relations": {
"project": "PRO3",
"latestCommitAuthor": "admin@gmail.com"
},
"properties": {
"public": false,
"documentation": "",
"link": "http://localhost:7990/projects/PRO3/repos/repostiroy-3/browse",
"forkable": true,
"description": null,
"state": "AVAILABLE",
"readme": null,
"swagger_url": null
},
"updatedAt": "2025-05-20T13:14:09.505Z"
}
Pull Request entity in Port
{
"blueprint": "bitbucketPullRequest",
"identifier": "1",
"createdAt": "2025-05-20T09:22:29.565Z",
"updatedBy": "jqoQ34Azuy08BJFFUZAKyP3sXranvmgc",
"createdBy": "jqoQ34Azuy08BJFFUZAKyP3sXranvmgc",
"team": [],
"title": "readme.md edited online with Bitbucket",
"relations": {
"repository": "repostiroy-3",
"participants": []
},
"properties": {
"updated_on": "2025-05-20T12:37:29Z",
"owner": "admin@gmail.com",
"created_on": "2025-05-20T08:38:44Z",
"mergedAt": "2025-05-20T12:37:29Z",
"link": "http://localhost:7990/projects/PRO3/repos/repostiroy-3/pull-requests/1",
"destination": "master",
"description": null,
"state": "MERGED",
"source": "main",
"reviewers": [],
"merge_commit": "3e4df0573a0ba1845ebdfa919c907745497313aa"
},
"updatedAt": "2025-05-20T12:37:34.111Z"
}
User entity in Port
{
"blueprint": "bitbucketUser",
"identifier": "admin@gmail.com",
"createdAt": "2025-05-20T09:10:04.195Z",
"updatedBy": "jqoQ34Azuy08BJFFUZAKyP3sXranvmgc",
"createdBy": "jqoQ34Azuy08BJFFUZAKyP3sXranvmgc",
"team": [],
"title": "Admin",
"relations": {},
"properties": {
"portUser": "admin@gmail.com",
"url": "http://localhost:7990/users/admin",
"username": "admin"
},
"updatedAt": "2025-05-20T09:10:04.195Z"
}
GitOps
The Bitbucket (Self-Hosted) Ocean integration does not support GitOps yet; this capability is planned for a future release and is a work in progress. If you need GitOps support in the meantime, you can use the webhook GitOps installation method.
Alternative installation via webhooks
While the Ocean integration described above is the recommended installation method, you may prefer to use a webhook to ingest users, projects, repositories and pull request entities from Bitbucket (Self-Hosted). If so, use the following instructions:
Note that when using this method, data will be ingested into Port only when the webhook is triggered.
Webhook installation (click to expand)
Port configuration
Create the following blueprint definitions:

Bitbucket project blueprint
{
"identifier": "bitbucketProject",
"description": "A software catalog to represent Bitbucket project",
"title": "Bitbucket Project",
"icon": "BitBucket",
"schema": {
"properties": {
"public": {
"icon": "DefaultProperty",
"title": "Public",
"type": "boolean"
},
"description": {
"title": "Description",
"type": "string",
"icon": "DefaultProperty"
},
"type": {
"icon": "DefaultProperty",
"title": "Type",
"type": "string"
},
"link": {
"title": "Link",
"icon": "DefaultProperty",
"type": "string"
}
},
"required": []
},
"mirrorProperties": {},
"calculationProperties": {},
"relations": {}
}

Bitbucket user blueprint
{
"identifier": "bitbucketUser",
"description": "A software catalog to represent Bitbucket users",
"title": "Bitbucket User",
"icon": "BitBucket",
"schema": {
"properties": {
"username": {
"type": "string",
"title": "Username",
"description": "The username of the user"
},
"url": {
"title": "URL",
"description": "The link to the user profile",
"icon": "BitBucket",
"type": "string"
}
},
"required": [
"username"
]
},
"mirrorProperties": {},
"calculationProperties": {},
"aggregationProperties": {},
"relations": {}
}

Bitbucket repository blueprint
{
"identifier": "bitbucketRepository",
"description": "A software catalog to represent Bitbucket repositories",
"title": "Bitbucket Repository",
"icon": "BitBucket",
"schema": {
"properties": {
"forkable": {
"icon": "DefaultProperty",
"title": "Is Forkable",
"type": "boolean"
},
"description": {
"title": "Description",
"type": "string",
"icon": "DefaultProperty"
},
"public": {
"icon": "DefaultProperty",
"title": "Is Public",
"type": "boolean"
},
"state": {
"icon": "DefaultProperty",
"title": "State",
"type": "string"
},
"link": {
"title": "Link",
"icon": "DefaultProperty",
"type": "string"
},
"documentation": {
"icon": "DefaultProperty",
"title": "Documentation",
"type": "string",
"format": "markdown"
},
"swagger_url": {
"title": "Swagger URL",
"type": "string",
"format": "url",
"spec": "async-api",
"icon": "DefaultProperty"
}
},
"required": []
},
"mirrorProperties": {},
"calculationProperties": {},
"aggregationProperties": {},
"relations": {
"latestCommitAuthor": {
"title": "Latest Commit By",
"description": "The user that made the most recent commit to the base branch",
"target": "bitbucketUser",
"required": false,
"many": false
},
"project": {
"title": "Project",
"target": "bitbucketProject",
"required": false,
"many": false
}
}
}

Bitbucket pull request blueprint
{
"identifier": "bitbucketPullrequest",
"description": "A software catalog to represent Bitbucket pull requests",
"title": "Bitbucket Pull Request",
"icon": "BitBucket",
"schema": {
"properties": {
"created_on": {
"title": "Created On",
"type": "string",
"format": "date-time",
"icon": "DefaultProperty"
},
"updated_on": {
"title": "Updated On",
"type": "string",
"format": "date-time",
"icon": "DefaultProperty"
},
"description": {
"title": "Description",
"type": "string",
"icon": "DefaultProperty"
},
"state": {
"icon": "DefaultProperty",
"title": "State",
"type": "string",
"enum": [
"OPEN",
"MERGED",
"DECLINED",
"SUPERSEDED"
],
"enumColors": {
"OPEN": "yellow",
"MERGED": "green",
"DECLINED": "red",
"SUPERSEDED": "purple"
}
},
"owner": {
"title": "Owner",
"type": "string",
"icon": "DefaultProperty"
},
"link": {
"title": "Link",
"icon": "DefaultProperty",
"type": "string"
},
"destination": {
"title": "Destination Branch",
"type": "string",
"icon": "DefaultProperty"
},
"source": {
"title": "Source Branch",
"type": "string",
"icon": "DefaultProperty"
},
"reviewers": {
"items": {
"type": "string"
},
"title": "Reviewers",
"type": "array",
"icon": "DefaultProperty"
},
"merge_commit": {
"title": "Merge Commit",
"type": "string",
"icon": "DefaultProperty"
},
"mergedAt": {
"title": "Merged At",
"type": "string",
"format": "date-time",
"icon": "DefaultProperty"
}
},
"required": []
},
"mirrorProperties": {},
"calculationProperties": {},
"aggregationProperties": {},
"relations": {
"participants": {
"title": "Participants",
"description": "Users that contributed to the PR",
"target": "bitbucketUser",
"required": false,
"many": true
},
"repository": {
"title": "Repository",
"target": "bitbucketRepository",
"required": false,
"many": false
}
}
}
You may modify the properties in your blueprints depending on what you want to track in your Bitbucket account.
Let's test it
This section includes a sample webhook event sent from Bitbucket when a pull request is merged. In addition, it includes the entity created from the event based on the webhook configuration provided in the previous section.
Payload
Here is an example of the payload structure sent to the webhook URL when a Bitbucket pull request is merged:

Webhook event payload
{
"body": {
"eventKey": "pr:merged",
"date": "2023-11-16T11:03:42+0000",
"actor": {
"name": "admin",
"emailAddress": "username@gmail.com",
"active": true,
"displayName": "Test User",
"id": 2,
"slug": "admin",
"type": "NORMAL",
"links": {
"self": [
{
"href": "http://myhost:7990/users/admin"
}
]
}
},
"pullRequest": {
"id": 2,
"version": 2,
"title": "lint code",
"description": "here is the description",
"state": "MERGED",
"open": false,
"closed": true,
"createdDate": 1700132280533,
"updatedDate": 1700132622026,
"closedDate": 1700132622026,
"fromRef": {
"id": "refs/heads/dev",
"displayId": "dev",
"latestCommit": "9e08604e14fa72265d65696608725c2b8f7850f2",
"type": "BRANCH",
"repository": {
"slug": "data-analyses",
"id": 1,
"name": "data analyses",
"description": "This is for my repository and all the blah blah blah",
"hierarchyId": "24cfae4b0dd7bade7edc",
"scmId": "git",
"state": "AVAILABLE",
"statusMessage": "Available",
"forkable": true,
"project": {
"key": "MOPP",
"id": 1,
"name": "My On Prem Project",
"description": "On premise test project is sent to us for us",
"public": false,
"type": "NORMAL",
"links": {
"self": [
{
"href": "http://myhost:7990/projects/MOPP"
}
]
}
},
"public": false,
"archived": false,
"links": {
"clone": [
{
"href": "ssh://git@myhost:7999/mopp/data-analyses.git",
"name": "ssh"
},
{
"href": "http://myhost:7990/scm/mopp/data-analyses.git",
"name": "http"
}
],
"self": [
{
"href": "http://myhost:7990/projects/MOPP/repos/data-analyses/browse"
}
]
}
}
},
"toRef": {
"id": "refs/heads/main",
"displayId": "main",
"latestCommit": "e461aae894b6dc951f405dca027a3f5567ea6bee",
"type": "BRANCH",
"repository": {
"slug": "data-analyses",
"id": 1,
"name": "data analyses",
"description": "This is for my repository and all the blah blah blah",
"hierarchyId": "24cfae4b0dd7bade7edc",
"scmId": "git",
"state": "AVAILABLE",
"statusMessage": "Available",
"forkable": true,
"project": {
"key": "MOPP",
"id": 1,
"name": "My On Prem Project",
"description": "On premise test project is sent to us for us",
"public": false,
"type": "NORMAL",
"links": {
"self": [
{
"href": "http://myhost:7990/projects/MOPP"
}
]
}
},
"public": false,
"archived": false,
"links": {
"clone": [
{
"href": "ssh://git@myhost:7999/mopp/data-analyses.git",
"name": "ssh"
},
{
"href": "http://myhost:7990/scm/mopp/data-analyses.git",
"name": "http"
}
],
"self": [
{
"href": "http://myhost:7990/projects/MOPP/repos/data-analyses/browse"
}
]
}
}
},
"locked": false,
"author": {
"user": {
"name": "admin",
"emailAddress": "username@gmail.com",
"active": true,
"displayName": "Test User",
"id": 2,
"slug": "admin",
"type": "NORMAL",
"links": {
"self": [
{
"href": "http://myhost:7990/users/admin"
}
]
}
},
"role": "AUTHOR",
"approved": false,
"status": "UNAPPROVED"
},
"reviewers": [],
"participants": [],
"properties": {
"mergeCommit": {
"displayId": "1cbccf99220",
"id": "1cbccf99220b23f89624c7c604f630663a1aaf8e"
}
},
"links": {
"self": [
{
"href": "http://myhost:7990/projects/MOPP/repos/data-analyses/pull-requests/2"
}
]
}
}
},
"headers": {
"X-Forwarded-For": "10.0.148.57",
"X-Forwarded-Proto": "https",
"X-Forwarded-Port": "443",
"Host": "ingest.getport.io",
"X-Amzn-Trace-Id": "Self=1-6555f719-267a0fce1e7a4d8815de94f7;Root=1-6555f719-1906872f41621b17250bb83a",
"Content-Length": "2784",
"User-Agent": "Atlassian HttpClient 3.0.4 / Bitbucket-8.15.1 (8015001) / Default",
"Content-Type": "application/json; charset=UTF-8",
"accept": "*/*",
"X-Event-Key": "pr:merged",
"X-Hub-Signature": "sha256=bf366faf8d8c41a4af21d25d922b87c3d1d127b5685238b099d2f311ad46e978",
"X-Request-Id": "d5fa6a16-bb6c-40d6-9c50-bc4363e79632",
"via": "HTTP/1.1 AmazonAPIGateway",
"forwarded": "for=154.160.30.235;host=ingest.getport.io;proto=https"
},
"queryParams": {}
}

Mapping result
{
"identifier":"2",
"title":"lint code",
"blueprint":"bitbucketPullrequest",
"properties":{
"created_on":"2023-11-16T10:58:00Z",
"updated_on":"2023-11-16T11:03:42Z",
"merge_commit":"9e08604e14fa72265d65696608725c2b8f7850f2",
"state":"MERGED",
"owner":"Test User",
"link":"http://myhost:7990/projects/MOPP/repos/data-analyses/pull-requests/2",
"destination":"main",
"source":"dev",
"participants":[],
"reviewers":[]
},
"relations":{
"repository":"data-analyses"
},
"filter":true
}
Import Bitbucket historical data
In this example you are going to use the provided Python script to set up webhooks and fetch data from the Bitbucket Server / Bitbucket Data Center API and ingest it to Port.
Prerequisites
This example utilizes the blueprint definition from the previous section.
In addition, provide the following environment variables (a minimal example of exporting them is shown after this list):
- PORT_CLIENT_ID - Your Port client id
- PORT_CLIENT_SECRET - Your Port client secret
- BITBUCKET_HOST - Bitbucket Server / Bitbucket Data Center host, such as http://localhost:7990
- BITBUCKET_USERNAME - Bitbucket username to use when accessing the Bitbucket resources
- BITBUCKET_PASSWORD - Bitbucket account password
- BITBUCKET_PROJECTS_FILTER - An optional comma-separated list of Bitbucket projects to filter. If not provided, all projects will be fetched.
- WEBHOOK_SECRET - An optional secret to use when creating a webhook in Port. If not provided, bitbucket_webhook_secret will be used.
- PORT_API_URL - An optional variable that defaults to the EU Port API https://api.getport.io/v1. For US organizations, use https://api.us.getport.io/v1 instead.
- IS_VERSION_8_7_OR_OLDER - An optional variable that specifies whether the Bitbucket version is 8.7 or older. This setting determines if webhooks should be created at the repository level (for older versions, <= 8.7) or at the project level (for newer versions, >= 8.8).
- PULL_REQUEST_STATE - An optional variable to specify the state of Bitbucket pull requests to be ingested. Accepted values are "ALL", "OPEN", "MERGED", or "DECLINED". If not specified, the default value is OPEN.
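As a minimal sketch of providing these variables and running the script: the dependencies listed here are taken from the script's imports, and app.py is only a placeholder name for wherever you saved the script from the repository referenced below.

```bash
# Install the script's dependencies (based on its imports)
pip install httpx loguru python-decouple

# Required settings
export PORT_CLIENT_ID="<your Port client id>"
export PORT_CLIENT_SECRET="<your Port client secret>"
export BITBUCKET_HOST="http://localhost:7990"
export BITBUCKET_USERNAME="<bitbucket username>"
export BITBUCKET_PASSWORD="<bitbucket password>"

# Optional settings
export BITBUCKET_PROJECTS_FILTER="PROJ,PRO3"
export PULL_REQUEST_STATE="OPEN"

# "app.py" is a placeholder - run the script file from the cloned repository
python3 app.py
```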
This app will automatically set up a webhook that allows Bitbucket to send events to Port. To understand more about how Bitbucket sends event payloads via webhooks, you can refer to this documentation.
Ensure that the Bitbucket credentials you use have PROJECT_ADMIN permissions to successfully configure the webhook. For more details on the necessary permissions and setup, see the official Bitbucket documentation.
Find your Port credentials using this guide.
Use the following Python script to set up the webhook and ingest historical Bitbucket users, projects, repositories and pull requests into Port. You can pull the latest version of this code by cloning this repository.

Bitbucket Python script
import json
import re
import time
from datetime import datetime, timedelta
import asyncio
from typing import Any, Optional
import httpx
from decouple import config
from loguru import logger
from httpx import BasicAuth
# These are the credentials passed by the variables of your pipeline to your tasks and into your env
PORT_CLIENT_ID = config("PORT_CLIENT_ID")
PORT_CLIENT_SECRET = config("PORT_CLIENT_SECRET")
BITBUCKET_USERNAME = config("BITBUCKET_USERNAME")
BITBUCKET_PASSWORD = config("BITBUCKET_PASSWORD")
BITBUCKET_API_URL = config("BITBUCKET_HOST")
BITBUCKET_PROJECTS_FILTER = config(
"BITBUCKET_PROJECTS_FILTER", cast=lambda v: v.split(",") if v else None, default=[]
)
PORT_API_URL = config("PORT_API_URL", default="https://api.getport.io/v1")
WEBHOOK_SECRET = config("WEBHOOK_SECRET", default="bitbucket_webhook_secret")
IS_VERSION_8_7_OR_OLDER = config("IS_VERSION_8_7_OR_OLDER", default=False)
VALID_PULL_REQUEST_STATES = {"ALL", "OPEN", "MERGED", "DECLINED"}
PULL_REQUEST_STATE = config("PULL_REQUEST_STATE", default="OPEN").upper()
# According to https://support.atlassian.com/bitbucket-cloud/docs/api-request-limits/
RATE_LIMIT = 1000 # Maximum number of requests allowed per hour
RATE_PERIOD = 3600 # Rate limit reset period in seconds (1 hour)
WEBHOOK_IDENTIFIER = "bitbucket_mapper"
WEBHOOK_EVENTS = [
"repo:modified",
"project:modified",
"pr:modified",
"pr:opened",
"pr:merged",
"pr:reviewer:updated",
"pr:declined",
"pr:deleted",
"pr:comment:deleted",
"pr:from_ref_updated",
"pr:comment:edited",
"pr:reviewer:unapproved",
"pr:reviewer:needs_work",
"pr:reviewer:approved",
"pr:comment:added",
]
# Initialize rate limiting variables
request_count = 0
rate_limit_start = time.time()
port_access_token, token_expiry_time = None, datetime.now()
port_headers = {}
bitbucket_auth = BasicAuth(username=BITBUCKET_USERNAME, password=BITBUCKET_PASSWORD)
client = httpx.AsyncClient(timeout=httpx.Timeout(60))
async def get_access_token():
credentials = {"clientId": PORT_CLIENT_ID, "clientSecret": PORT_CLIENT_SECRET}
token_response = await client.post(
f"{PORT_API_URL}/auth/access_token", json=credentials
)
response_data = token_response.json()
access_token = response_data["accessToken"]
expires_in = response_data["expiresIn"]
token_expiry_time = datetime.now() + timedelta(seconds=expires_in)
return access_token, token_expiry_time
async def refresh_access_token():
global port_access_token, token_expiry_time, port_headers
logger.info("Refreshing access token...")
port_access_token, token_expiry_time = await get_access_token()
port_headers = {"Authorization": f"Bearer {port_access_token}"}
logger.info(f"New token received. Expiry time: {token_expiry_time}")
async def refresh_token_if_expired():
if datetime.now() >= token_expiry_time:
await refresh_access_token()
async def refresh_token_and_retry(method: str, url: str, **kwargs):
await refresh_access_token()
response = await client.request(method, url, headers=port_headers, **kwargs)
return response
def sanitize_identifier(identifier: str) -> str:
pattern = r"[^A-Za-z0-9@_.+:\/=-]"
# Replace any character that does not match the pattern with an underscore
return re.sub(pattern, "_", identifier)
async def send_port_request(method: str, endpoint: str, payload: Optional[dict] = None):
global port_access_token, token_expiry_time, port_headers
await refresh_token_if_expired()
url = f"{PORT_API_URL}/{endpoint}"
try:
response = await client.request(method, url, headers=port_headers, json=payload)
response.raise_for_status()
return response
except httpx.HTTPStatusError as e:
if e.response.status_code == 401:
logger.info("Received 401 Unauthorized. Refreshing token and retrying...")
try:
response = await refresh_token_and_retry(method, url, json=payload)
response.raise_for_status()
return response
except httpx.HTTPStatusError as e:
logger.error(
f"Error after retrying: {e.response.status_code}, {e.response.text}"
)
return {"status_code": e.response.status_code, "response": e.response}
else:
logger.error(
f"HTTP error occurred: {e.response.status_code}, {e.response.text}"
)
return {"status_code": e.response.status_code, "response": e.response}
except httpx.HTTPError as e:
logger.error(f"HTTP error occurred: {e}")
return {"status_code": None, "error": e}
async def get_or_create_port_webhook():
logger.info("Checking if a Bitbucket webhook is configured on Port...")
response = await send_port_request(
method="GET", endpoint=f"webhooks/{WEBHOOK_IDENTIFIER}"
)
if isinstance(response, dict):
if response.get("status_code") == 404:
logger.info("Port webhook not found, creating a new one.")
return await create_port_webhook()
else:
return None
else:
webhook_url = response.json().get("integration", {}).get("url")
logger.info(f"Webhook configuration exists in Port. URL: {webhook_url}")
return webhook_url
async def create_port_webhook():
logger.info("Creating a webhook for Bitbucket on Port...")
with open("./resources/webhook_configuration.json", "r") as file:
mappings = json.load(file)
webhook_data = {
"identifier": WEBHOOK_IDENTIFIER,
"title": "Bitbucket Webhook",
"description": "Webhook for receiving Bitbucket events",
"icon": "BitBucket",
"mappings": mappings,
"enabled": True,
"security": {
"secret": WEBHOOK_SECRET,
"signatureHeaderName": "x-hub-signature",
"signatureAlgorithm": "sha256",
"signaturePrefix": "sha256=",
"requestIdentifierPath": ".headers['x-request-id']",
},
"integrationType": "custom",
}
response = await send_port_request(
method="POST", endpoint="webhooks", payload=webhook_data
)
if isinstance(response, dict):
if response.get("status_code") == 442:
logger.error("Incorrect mapping, kindly fix!")
return None
else:
webhook_url = response.json().get("integration", {}).get("url")
logger.info(
f"Webhook configuration successfully created in Port: {webhook_url}"
)
return webhook_url
def generate_webhook_data(webhook_url: str, events: list[str]) -> dict:
return {
"name": "Port Webhook",
"url": webhook_url,
"events": events,
"active": True,
"sslVerificationRequired": True,
"configuration": {"secret": WEBHOOK_SECRET, "createdBy": "Port"},
}
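# Bitbucket-side webhook setup. Depending on the Bitbucket version
# (IS_VERSION_8_7_OR_OLDER), the script registers the Port webhook URL either per
# repository (.../repos/{repo}/webhooks) or once per project
# (.../projects/{project}/webhooks), subscribing to the events listed in WEBHOOK_EVENTS.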
async def create_project_level_webhook(
project_key: str, webhook_url: str, events: list[str]
):
logger.info(f"Creating project-level webhook for project: {project_key}")
webhook_data = generate_webhook_data(webhook_url, events)
try:
response = await client.post(
f"{BITBUCKET_API_URL}/rest/api/1.0/projects/{project_key}/webhooks",
json=webhook_data,
auth=bitbucket_auth,
)
response.raise_for_status()
logger.info(f"Successfully created project-level webhook for {project_key}")
return response.json()
except httpx.HTTPStatusError as e:
logger.error(
f"HTTP error when creating webhook for project: {project_key} code: {e.response.status_code} response: {e.response.text}"
)
return None
async def create_repo_level_webhook(
project_key: str, repo_key: str, webhook_url: str, events: list[str]
):
logger.info(f"Creating repo-level webhook for repo: {repo_key}")
webhook_data = generate_webhook_data(webhook_url, events)
try:
response = await client.post(
f"{BITBUCKET_API_URL}/rest/api/1.0/projects/{project_key}/repos/{repo_key}/webhooks",
json=webhook_data,
auth=bitbucket_auth,
)
response.raise_for_status()
logger.info(f"Successfully created repo-level webhook for {repo_key}")
return response.json()
except httpx.HTTPStatusError as e:
logger.error(
f"HTTP error when creating webhook for repo: {repo_key} code: {e.response.status_code} response: {e.response.text}"
)
return None
async def get_or_create_bitbucket_webhook(
project_key: str,
webhook_url: str,
events: list[str],
repo_key: Optional[str] = None,
):
logger.info(f"Checking webhooks for {repo_key or project_key}")
if webhook_url is not None:
try:
matching_webhooks = [
webhook
async for project_webhooks_batch in get_paginated_resource(
path=(
f"projects/{project_key}/repos/{repo_key}/webhooks"
if repo_key
else f"projects/{project_key}/webhooks"
)
)
for webhook in project_webhooks_batch
if webhook["url"] == webhook_url
]
if matching_webhooks:
logger.info(f"Webhook already exists for {repo_key or project_key}.")
return matching_webhooks[0]
logger.info(
f"Webhook not found for {repo_key or project_key}. Creating a new one."
)
if repo_key:
return await create_repo_level_webhook(
project_key, repo_key, webhook_url, events
)
else:
return await create_project_level_webhook(
project_key, webhook_url, events
)
except httpx.HTTPStatusError as e:
logger.error(
f"HTTP error when checking webhooks for project: {project_key} code: {e.response.status_code} response: {e.response.text}"
)
return None
else:
logger.error("Port webhook URL is not available. Skipping webhook check...")
return None
async def add_entity_to_port(blueprint_id, entity_object):
response = await send_port_request(
method="POST",
endpoint=f"blueprints/{blueprint_id}/entities?upsert=true&merge=true",
payload=entity_object,
)
if not isinstance(response, dict):
logger.info(response.json())
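# Generic paginated reader for the Bitbucket REST API. It applies a simple client-side
# rate limit (at most RATE_LIMIT requests per RATE_PERIOD seconds) and follows
# Bitbucket's paging model: each page is requested with "limit"/"start" parameters and
# the next page offset is read from "nextPageStart" until it is absent. By default it
# yields each page's "values" list; with full_response=True it yields the raw page
# (used for fetching file contents such as the README).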
async def get_paginated_resource(
path: str,
    params: Optional[dict[str, Any]] = None,
page_size: int = 25,
full_response: bool = False,
):
global request_count, rate_limit_start
# Check if we've exceeded the rate limit, and if so, wait until the reset period is over
if request_count >= RATE_LIMIT:
elapsed_time = time.time() - rate_limit_start
if elapsed_time < RATE_PERIOD:
sleep_time = RATE_PERIOD - elapsed_time
await asyncio.sleep(sleep_time)
# Reset the rate limiting variables
request_count = 0
rate_limit_start = time.time()
url = f"{BITBUCKET_API_URL}/rest/api/1.0/{path}"
params = params or {}
params["limit"] = page_size
next_page_start = None
while True:
try:
if next_page_start:
params["start"] = next_page_start
response = await client.get(url=url, auth=bitbucket_auth, params=params)
response.raise_for_status()
page_json = response.json()
request_count += 1
logger.debug(
f"Requested data for {path}, with params: {params} and response code: {response.status_code}"
)
if full_response:
yield page_json
else:
batch_data = page_json["values"]
yield batch_data
next_page_start = page_json.get("nextPageStart")
if not next_page_start:
break
except httpx.HTTPStatusError as e:
if e.response.status_code == 404:
logger.info(
f"Could not find the requested resources {path}. Terminating gracefully..."
)
return
            logger.error(
                f"HTTP error with code {e.response.status_code}, content: {e.response.text}"
            )
            return
        except httpx.HTTPError as e:
            logger.error(f"HTTP error occurred while fetching Bitbucket data: {e}")
            return
logger.info(f"Successfully fetched paginated data for {path}")
async def get_single_project(project_key: str):
response = await client.get(
f"{BITBUCKET_API_URL}/rest/api/1.0/projects/{project_key}", auth=bitbucket_auth
)
response.raise_for_status()
return response.json()
def convert_to_datetime(timestamp: int):
converted_datetime = datetime.utcfromtimestamp(timestamp / 1000.0)
return converted_datetime.strftime("%Y-%m-%dT%H:%M:%SZ")
def parse_repository_file_response(file_response: dict[str, Any]) -> str:
lines = file_response.get("lines", [])
logger.info(f"Received readme file with {len(lines)} entries")
readme_content = ""
for line in lines:
readme_content += line.get("text", "") + "\n"
return readme_content
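# The process_*_entities functions below translate raw Bitbucket API objects into Port
# entities for the bitbucketUser, bitbucketProject, bitbucketRepository and
# bitbucketPullrequest blueprints, and upsert them via add_entity_to_port. Identifiers
# (email address, project key, repository slug, pull request id) are passed through
# sanitize_identifier first.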
async def process_user_entities(users_data: list[dict[str, Any]]):
blueprint_id = "bitbucketUser"
for user in users_data:
entity = {
"title": user.get("displayName"),
"properties": {
"username": user.get("name"),
"url": user.get("links", {}).get("self", [{}])[0].get("href"),
},
"relations": {},
}
        identifier = user.get("emailAddress")
        if not identifier:
            logger.warning(f"Skipping user '{user.get('name')}' with no email address")
            continue
        entity["identifier"] = sanitize_identifier(str(identifier))
        await add_entity_to_port(blueprint_id=blueprint_id, entity_object=entity)
async def process_project_entities(projects_data: list[dict[str, Any]]):
blueprint_id = "bitbucketProject"
for project in projects_data:
entity = {
"title": project.get("name"),
"properties": {
"description": project.get("description"),
"public": project.get("public"),
"type": project.get("type"),
"link": project.get("links", {}).get("self", [{}])[0].get("href"),
},
"relations": {},
}
identifier = str(project.get("key"))
if identifier:
entity["identifier"] = sanitize_identifier(identifier)
await add_entity_to_port(blueprint_id=blueprint_id, entity_object=entity)
async def process_repository_entities(repository_data: list[dict[str, Any]]):
blueprint_id = "bitbucketRepository"
for repo in repository_data:
readme_content = await get_repository_readme(
project_key=repo["project"]["key"], repo_slug=repo["slug"]
)
entity = {
"title": repo.get("name"),
"properties": {
"description": repo.get("description"),
"state": repo.get("state"),
"forkable": repo.get("forkable"),
"public": repo.get("public"),
"link": repo.get("links", {}).get("self", [{}])[0].get("href"),
"documentation": readme_content,
"swagger_url": f"https://api.{repo.get('slug')}.com",
},
"relations": dict(
project=repo.get("project", {}).get("key"),
latestCommitAuthor=repo.get("__latestCommit", {})
.get("committer", {})
.get("emailAddress"),
),
}
identifier = str(repo.get("slug"))
if identifier:
entity["identifier"] = sanitize_identifier(identifier)
await add_entity_to_port(blueprint_id=blueprint_id, entity_object=entity)
async def process_pullrequest_entities(pullrequest_data: list[dict[str, Any]]):
blueprint_id = "bitbucketPullrequest"
for pr in pullrequest_data:
entity = {
"title": pr.get("title"),
"properties": {
"created_on": convert_to_datetime(pr.get("createdDate")),
"updated_on": convert_to_datetime(pr.get("updatedDate")),
"mergedAt": convert_to_datetime(pr.get("closedDate", 0)),
"merge_commit": pr.get("fromRef", {}).get("latestCommit"),
"description": pr.get("description"),
"state": pr.get("state"),
"owner": pr.get("author", {}).get("user", {}).get("emailAddress"),
"link": pr.get("links", {}).get("self", [{}])[0].get("href"),
"destination": pr.get("toRef", {}).get("displayId"),
"reviewers": [
reviewer_email
for reviewer in pr.get("reviewers", [])
if (reviewer_email := reviewer.get("user", {}).get("emailAddress"))
],
"source": pr.get("fromRef", {}).get("displayId"),
},
"relations": {
"repository": pr["toRef"]["repository"]["slug"],
"participants": [
email
for email in [
pr.get("author", {}).get("user", {}).get("emailAddress")
]
+ [
user.get("user", {}).get("emailAddress", "")
for user in pr.get("participants", [])
]
if email
],
},
}
identifier = str(pr.get("id"))
if identifier:
entity["identifier"] = sanitize_identifier(identifier)
await add_entity_to_port(blueprint_id=blueprint_id, entity_object=entity)
async def get_repository_readme(project_key: str, repo_slug: str) -> str:
file_path = f"projects/{project_key}/repos/{repo_slug}/browse/README.md"
readme_content = ""
async for readme_file_batch in get_paginated_resource(
path=file_path, page_size=500, full_response=True
):
file_content = parse_repository_file_response(readme_file_batch)
readme_content += file_content
return readme_content
async def get_latest_commit(project_key: str, repo_slug: str) -> dict[str, Any]:
try:
commit_path = f"projects/{project_key}/repos/{repo_slug}/commits"
async for commit_batch in get_paginated_resource(path=commit_path):
if commit_batch:
latest_commit = commit_batch[0]
return latest_commit
return {}
except Exception as e:
logger.error(f"Error fetching latest commit for repo {repo_slug}: {e}")
return {}
async def get_repositories(project: dict[str, Any], port_webhook_url: str):
repositories_path = f"projects/{project['key']}/repos"
async for repositories_batch in get_paginated_resource(path=repositories_path):
logger.info(
f"received repositories batch with size {len(repositories_batch)} from project: {project['key']}"
)
await process_repository_entities(
repository_data=[
{
**repo,
"__latestCommit": await get_latest_commit(
project_key=project["key"], repo_slug=repo["slug"]
),
}
for repo in repositories_batch
]
)
        if IS_VERSION_8_7_OR_OLDER:
            for repo in repositories_batch:
                await get_or_create_bitbucket_webhook(
                    project_key=project["key"],
                    repo_key=repo["slug"],
                    webhook_url=port_webhook_url,
                    events=WEBHOOK_EVENTS,
                )
await get_repository_pull_requests(repository_batch=repositories_batch)
async def get_repository_pull_requests(repository_batch: list[dict[str, Any]]):
global PULL_REQUEST_STATE
for repository in repository_batch:
pull_requests_path = f"projects/{repository['project']['key']}/repos/{repository['slug']}/pull-requests"
if PULL_REQUEST_STATE not in VALID_PULL_REQUEST_STATES:
logger.warning(
f"Invalid PULL_REQUEST_STATE '{PULL_REQUEST_STATE}' provided. Defaulting to 'OPEN'."
)
PULL_REQUEST_STATE = "OPEN"
async for pull_requests_batch in get_paginated_resource(
path=pull_requests_path,
params={"state": PULL_REQUEST_STATE},
):
logger.info(
f"received pull requests batch with size {len(pull_requests_batch)} from repo: {repository['slug']}"
)
await process_pullrequest_entities(pullrequest_data=pull_requests_batch)
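# Entry point: ingest all users, then walk projects (optionally restricted by
# BITBUCKET_PROJECTS_FILTER), their repositories and pull requests, and ensure the Port
# webhook and the matching Bitbucket webhooks exist so that subsequent changes are
# delivered in real time.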
async def main():
logger.info("Starting Bitbucket data extraction")
async for users_batch in get_paginated_resource(path="admin/users"):
logger.info(f"received users batch with size {len(users_batch)}")
await process_user_entities(users_data=users_batch)
project_path = "projects"
if BITBUCKET_PROJECTS_FILTER:
async def filtered_projects_generator():
yield [await get_single_project(key) for key in BITBUCKET_PROJECTS_FILTER]
projects = filtered_projects_generator()
else:
projects = get_paginated_resource(path=project_path)
port_webhook_url = await get_or_create_port_webhook()
if not port_webhook_url:
logger.error("Failed to get or create Port webhook. Skipping webhook setup...")
async for projects_batch in projects:
logger.info(f"received projects batch with size {len(projects_batch)}")
await process_project_entities(projects_data=projects_batch)
for project in projects_batch:
await get_repositories(project=project, port_webhook_url=port_webhook_url)
if not IS_VERSION_8_7_OR_OLDER:
await get_or_create_bitbucket_webhook(
project_key=project["key"],
webhook_url=port_webhook_url,
events=WEBHOOK_EVENTS,
)
logger.info("Bitbucket data extraction completed")
await client.aclose()
if __name__ == "__main__":
    asyncio.run(main())

Bitbucket webhook configuration
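The mapping below is the content of the resources/webhook_configuration.json file loaded by the script's create_port_webhook function. It tells Port how to turn incoming Bitbucket events into catalog entities: each object selects events with a jq filter on eventKey and maps the payload fields to the corresponding blueprint's identifier, title, properties and relations.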
[
{
"blueprint": "bitbucketProject",
"filter": ".body.eventKey == \"project:modified\"",
"entity": {
"identifier": ".body.new.key | tostring",
"title": ".body.new.name",
"properties": {
"public": ".body.new.public",
"type": ".body.new.type",
"description": ".body.new.description",
"link": ".body.new.links.self[0].href"
}
}
},
{
"blueprint": "bitbucketRepository",
"filter": ".body.eventKey == \"repo:modified\"",
"entity": {
"identifier": ".body.new.slug",
"title": ".body.new.name",
"properties": {
"description": ".body.new.description",
"state": ".body.new.state",
"forkable": ".body.new.forkable",
"public": ".body.new.public",
"link": ".body.new.links.self[0].href"
},
"relations": {
"project": ".body.new.project.key"
}
}
},
{
"blueprint": "bitbucketPullrequest",
"filter": ".body.eventKey | startswith(\"pr:\")",
"entity": {
"identifier": ".body.pullRequest.id | tostring",
"title": ".body.pullRequest.title",
"properties": {
"created_on": ".body.pullRequest.createdDate | (tonumber / 1000 | strftime(\"%Y-%m-%dT%H:%M:%SZ\"))",
"updated_on": ".body.pullRequest.updatedDate | (tonumber / 1000 | strftime(\"%Y-%m-%dT%H:%M:%SZ\"))",
"merge_commit": ".body.pullRequest.fromRef.latestCommit",
"state": ".body.pullRequest.state",
"owner": ".body.pullRequest.author.user.emailAddress",
"link": ".body.pullRequest.links.self[0].href",
"destination": ".body.pullRequest.toRef.displayId",
"source": ".body.pullRequest.fromRef.displayId",
"mergedAt": ".body.pullRequest.closedDate | (tonumber / 1000 | strftime(\"%Y-%m-%dT%H:%M:%SZ\"))",
"reviewers": "[.body.pullRequest.reviewers[].user.emailAddress]"
},
"relations": {
"repository": ".body.pullRequest.toRef.repository.slug",
"participants": "[.body.pullRequest.participants[].user.emailAddress]"
}
}
}
]
Done! You can now import historical users, projects, repositories, and pull requests from Bitbucket into Port. Any change to a project, repository, or pull request in Bitbucket will trigger a webhook event to the webhook URL provided by Port, and Port will parse the event according to the mapping above and update the catalog entities accordingly.
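To make the mapping concrete, here is a minimal, illustrative sketch (not Port's actual parsing code) that applies the repo:modified mapping by hand to a trimmed, hypothetical event payload; the field paths mirror the jq expressions above, with the event wrapped under ".body" the way Port exposes it to the mapping.

# Illustrative only: a trimmed, hypothetical "repo:modified" payload and the fields
# the bitbucketRepository mapping extracts from it.
event = {
    "body": {
        "eventKey": "repo:modified",
        "new": {
            "slug": "payments-service",
            "name": "Payments Service",
            "description": "Handles payment processing",
            "state": "AVAILABLE",
            "forkable": True,
            "public": False,
            "project": {"key": "PAY"},
            "links": {
                "self": [{"href": "https://bitbucket.example.com/projects/PAY/repos/payments-service/browse"}]
            },
        },
    }
}

new = event["body"]["new"]
entity = {
    "identifier": new["slug"],                        # .body.new.slug
    "title": new["name"],                             # .body.new.name
    "properties": {
        "description": new["description"],           # .body.new.description
        "state": new["state"],
        "forkable": new["forkable"],
        "public": new["public"],
        "link": new["links"]["self"][0]["href"],      # .body.new.links.self[0].href
    },
    "relations": {"project": new["project"]["key"]},  # .body.new.project.key
}
print(entity)  # the bitbucketRepository entity Port would upsert for this event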