superset 2019-07-15_12-00_190188938582_adding_unique_constraint_on_dashboard_slices_tbl source code


superset 2019-07-15_12-00_190188938582_adding_unique_constraint_on_dashboard_slices_tbl code

File path: /superset/migrations/versions/2019-07-15_12-00_190188938582_adding_unique_constraint_on_dashboard_slices_tbl.py

# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied.  See the License for the
# specific language governing permissions and limitations
# under the License.
"""Remove duplicated entries in dashboard_slices table and add unique constraint

Revision ID: 190188938582
Revises: d6ffdf31bdd4
Create Date: 2019-07-15 12:00:32.267507

"""
import logging

from alembic import op
from sqlalchemy import and_, Column, ForeignKey, Integer, Table
from sqlalchemy.ext.declarative import declarative_base

from superset import db

# revision identifiers, used by Alembic.
revision = "190188938582"
down_revision = "d6ffdf31bdd4"

Base = declarative_base()


class DashboardSlices(Base):
    __tablename__ = "dashboard_slices"
    id = Column(Integer, primary_key=True)
    dashboard_id = Column(Integer, ForeignKey("dashboards.id"))
    slice_id = Column(Integer, ForeignKey("slices.id"))


def upgrade():
    bind = op.get_bind()
    session = db.Session(bind=bind)

    # find dup records in dashboard_slices tbl
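    # (roughly: SELECT dashboard_id, slice_id, COUNT(id) FROM dashboard_slices
    #  GROUP BY dashboard_id, slice_id HAVING COUNT(id) > 1)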
    dup_records = (
        session.query(
            DashboardSlices.dashboard_id,
            DashboardSlices.slice_id,
            db.func.count(DashboardSlices.id),
        )
        .group_by(DashboardSlices.dashboard_id, DashboardSlices.slice_id)
        .having(db.func.count(DashboardSlices.id) > 1)
        .all()
    )

    # remove dup entries
    for record in dup_records:
        print(
            "remove duplicates from dashboard {} slice {}".format(
                record.dashboard_id, record.slice_id
            )
        )

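        # keep the first row for this (dashboard_id, slice_id) pair; offset(1)
        # skips it and collects only the ids of the surplus duplicates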
        ids = [
            item.id
            for item in session.query(DashboardSlices.id)
            .filter(
                and_(
                    DashboardSlices.slice_id == record.slice_id,
                    DashboardSlices.dashboard_id == record.dashboard_id,
                )
            )
            .offset(1)
        ]
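        # bulk-delete the surplus rows; synchronize_session=False skips
        # reconciling the in-memory session state, which is not needed here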
        session.query(DashboardSlices).filter(DashboardSlices.id.in_(ids)).delete(
            synchronize_session=False
        )

    # add unique constraint
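    # batch_alter_table lets the constraint be added on SQLite as well, where
    # plain ALTER TABLE cannot add a constraint; Alembic recreates the table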
    try:
        with op.batch_alter_table("dashboard_slices") as batch_op:
            batch_op.create_unique_constraint(
                "uq_dashboard_slice", ["dashboard_id", "slice_id"]
            )
    except Exception as ex:
        logging.exception(ex)


def downgrade():
    try:
        with op.batch_alter_table("dashboard_slices") as batch_op:
            batch_op.drop_constraint("uq_dashboard_slice", type_="unique")
    except Exception as ex:
        logging.exception(ex)

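The migration is applied as part of the normal Alembic chain (for example via superset db upgrade). A quick way to confirm it did its job is to re-run the duplicate check from upgrade() as plain SQL against the metadata database. The sketch below is illustrative only and not part of the migration; the connection URI is a placeholder assumption.

# Minimal post-migration sanity check (sketch). The URI below is a
# placeholder; point it at the actual Superset metadata database.
from sqlalchemy import create_engine, text

engine = create_engine("sqlite:////path/to/superset.db")  # hypothetical URI
with engine.connect() as conn:
    dups = conn.execute(
        text(
            "SELECT dashboard_id, slice_id, COUNT(id) AS n "
            "FROM dashboard_slices "
            "GROUP BY dashboard_id, slice_id "
            "HAVING COUNT(id) > 1"
        )
    ).fetchall()
    # an empty result means the unique constraint can hold
    print("remaining duplicate pairs:", dups)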