Dataset

This module contains experimental interfaces for working with the Synapse Python Client. Unless otherwise noted, these interfaces are subject to change at any time. Use at your own risk.

API reference

synapseclient.models.Dataset dataclass

Bases: DatasetSynchronousProtocol, AccessControllable, ViewBase, ViewStoreMixin, DeleteMixin, ColumnMixin, GetMixin, QueryMixin, ViewUpdateMixin, ViewSnapshotMixin

A Dataset object represents the metadata of a Synapse Dataset. https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/table/Dataset.html

Attributes
id

The unique immutable ID for this dataset. A new ID will be generated for new Datasets. Once issued, this ID is guaranteed to never change or be re-issued.

TYPE: Optional[str]

name

The name of this dataset. Must be 256 characters or less. Names may only contain: letters, numbers, spaces, underscores, hyphens, periods, plus signs, apostrophes, and parentheses.

TYPE: Optional[str]

description

The description of the dataset. Must be 1000 characters or less.

TYPE: Optional[str]

etag

Synapse employs an Optimistic Concurrency Control (OCC) scheme to handle concurrent updates. Since the E-Tag changes every time an entity is updated it is used to detect when a client's current representation of an entity is out-of-date.

TYPE: Optional[str]

created_on

The date this dataset was created.

TYPE: Optional[str]

modified_on

The date this dataset was last modified, in YYYY-MM-DDThh:mm:ss.sssZ format.

TYPE: Optional[str]

created_by

The ID of the user that created this dataset.

TYPE: Optional[str]

modified_by

The ID of the user that last modified this dataset.

TYPE: Optional[str]

parent_id

The ID of the Entity that is the parent of this dataset.

TYPE: Optional[str]

columns

The columns of this dataset. This is an ordered dictionary where the key is the name of the column and the value is the Column object. When creating a new instance of a Dataset object you may pass any of the following types as the columns argument:

  • A list of Column objects
  • A dictionary where the key is the name of the column and the value is the Column object
  • An OrderedDict where the key is the name of the column and the value is the Column object

The order of the columns will be the order they are stored in Synapse. If you need to reorder the columns, the recommended approach is to use the .reorder_column() method. Additionally, you may add and delete columns using the .add_column() and .delete_column() methods on your dataset class instance.
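
For example, a minimal sketch of reordering a column, assuming the .reorder_column() method accepts the column's name and its target (zero-based) index:

from synapseclient import Synapse
from synapseclient.models import Dataset

syn = Synapse()
syn.login()

dataset = Dataset(id="syn1234").get()
# Move "my_column" (a hypothetical column) to the front of the column order
dataset.reorder_column(name="my_column", index=0)
dataset.store()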

You may modify the attributes of the Column object to change the column type, name, or other attributes. For example, suppose you'd like to change a column from an INTEGER to a DOUBLE. You can do so by changing the column_type attribute of the Column object. The next time you store the dataset, the column will be updated in Synapse with the new type.

from synapseclient import Synapse
from synapseclient.models import Dataset
from synapseclient.models import Column, ColumnType

syn = Synapse()
syn.login()

dataset = Dataset(id="syn1234").get()
dataset.columns["my_column"].column_type = ColumnType.DOUBLE
dataset.store()

Note that the keys in this dictionary should match the column names as they are in Synapse. However, know that the name attribute of the Column object is used for all interactions with the Synapse API. The OrderedDict key is purely for the usage of this interface. For example, if you wish to rename a column you may do so by changing the name attribute of the Column object. The key in the OrderedDict does not need to be changed. The next time you store the dataset the column will be updated in Synapse with the new name and the key in the OrderedDict will be updated.
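
For example, a minimal sketch of renaming a column this way; only the Column object's name attribute changes, while the OrderedDict key is left alone until the dataset is stored:

from synapseclient import Synapse
from synapseclient.models import Dataset

syn = Synapse()
syn.login()

dataset = Dataset(id="syn1234").get()
# "my_column" is a hypothetical existing column; only its name attribute is changed
dataset.columns["my_column"].name = "my_renamed_column"
dataset.store()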

TYPE: Optional[Union[List[Column], OrderedDict[str, Column], Dict[str, Column]]]

version_number

The version number issued to this version of the object.

TYPE: Optional[int]

version_label

The version label for this dataset.

TYPE: Optional[str]

version_comment

The version comment for this dataset.

TYPE: Optional[str]

is_latest_version

If this is the latest version of the object.

TYPE: Optional[bool]

is_search_enabled

When creating or updating a dataset or view, specifies if full text search should be enabled. Note that enabling full text search might slow down the indexing of the dataset or view.

TYPE: Optional[bool]
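
For example, a minimal sketch of enabling full text search on an existing dataset, assuming the attribute can simply be set before storing:

from synapseclient import Synapse
from synapseclient.models import Dataset

syn = Synapse()
syn.login()

my_dataset = Dataset(id="syn1234").get()
my_dataset.is_search_enabled = True
my_dataset.store()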

items

The flat list of file entity references that define this dataset. This is effectively a list of the rows that are in/will be in the dataset after it is stored. The only way to add or remove rows is to add or remove items from this list.

TYPE: Optional[List[EntityRef]]

size

The cumulative size, in bytes, of all items (files) in the dataset. This is only correct after the dataset has been stored or newly read from Synapse.

TYPE: Optional[int]

checksum

The checksum is computed over a sorted concatenation of the checksums of all items in the dataset. This is only correct after the dataset has been stored or newly read from Synapse.

TYPE: Optional[str]

count

The number of items/files in the dataset. This is only correct after the dataset has been stored or newly read from Synapse.

TYPE: Optional[int]

activity

The Activity model represents the main record of Provenance in Synapse. It is analogous to the Activity defined in the W3C Specification on Provenance.

TYPE: Optional[Activity]

annotations

Additional metadata associated with the dataset. The key is the name of your desired annotations. The value is an object containing a list of values (use empty list to represent no values for key) and the value type associated with all values in the list.

TYPE: Optional[Dict[str, Union[List[str], List[bool], List[float], List[int], List[date], List[datetime]]]]
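
For example, a minimal sketch of attaching annotations to a dataset; the keys and values shown are hypothetical:

from synapseclient import Synapse
from synapseclient.models import Dataset

syn = Synapse()
syn.login()

my_dataset = Dataset(id="syn1234").get()
# Each value is a list; use an empty list to represent no values for a key
my_dataset.annotations = {"species": ["Homo sapiens"], "replicates": [3]}
my_dataset.store()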

include_default_columns

When creating a dataset or view, specifies if default columns should be included. Default columns are columns that are automatically added to the dataset or view. These columns are managed by Synapse and cannot be modified. If you attempt to create a column with the same name as a default column, you will receive a warning when you store the dataset.

include_default_columns is only used if this is the first time that the view is being stored. If you are updating an existing view this attribute will be ignored. If you want to add all default columns back to your view then you may use this code snippet to accomplish this:

import asyncio
from synapseclient import Synapse
from synapseclient.models import Dataset

syn = Synapse()
syn.login()

async def main():
    view = await Dataset(id="syn1234").get_async()
    await view._append_default_columns()
    await view.store_async()

asyncio.run(main())

The column you are overriding will not behave the same as a default column. For example, suppose you create a column called id on a Dataset. When using a default column, the id stores the Synapse ID of each of the entities included in the scope of the view. If you override the id column with a new column, the id column will no longer store the Synapse ID of the entities in the view. Instead, it will store the values you provide when you store the dataset. It will be stored as an annotation on the entity for the row you are modifying.
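
If you want to skip the default columns entirely, here is a minimal sketch of doing so when first creating a dataset, assuming include_default_columns may be passed at construction time like the other attributes:

from synapseclient import Synapse
from synapseclient.models import Dataset

syn = Synapse()
syn.login()

my_dataset = Dataset(
    parent_id="syn987",
    name="my-dataset-without-default-columns",
    include_default_columns=False,
)
my_dataset.store()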

TYPE: Optional[bool]

Create a new dataset from a list of EntityRefs.

Dataset items consist of references to Synapse Files using an Entity Reference. If you are adding items to a Dataset directly, you must provide them in the form of an EntityRef class instance.

from synapseclient import Synapse
from synapseclient.models import Dataset, EntityRef

syn = Synapse()
syn.login()

my_entity_refs = [EntityRef(id="syn1234"), EntityRef(id="syn1235"), EntityRef(id="syn1236")]
my_dataset = Dataset(parent_id="syn987", name="my-new-dataset", items=my_entity_refs)
my_dataset.store()

Add entities to an existing dataset.

Using add_item, you can add Synapse entities that are Files, Folders, or EntityRefs that point to a Synapse entity. If the entity is a Folder (or an EntityRef that points to a folder), all of the child Files within the Folder will be added to the Dataset recursively.

from synapseclient import Synapse
from synapseclient.models import Dataset, File, Folder, EntityRef

syn = Synapse()
syn.login()

my_dataset = Dataset(id="syn1234").get()

# Add a file to the dataset
my_dataset.add_item(File(id="syn1235"))

# Add a folder to the dataset
# All child files are recursively added to the dataset
my_dataset.add_item(Folder(id="syn1236"))

# Add an entity reference to the dataset
my_dataset.add_item(EntityRef(id="syn1237", version=1))

my_dataset.store()

Remove entities from a dataset.

 

from synapseclient import Synapse
from synapseclient.models import Dataset, File, Folder, EntityRef

syn = Synapse()
syn.login()

my_dataset = Dataset(id="syn1234").get()

# Remove a file from the dataset
my_dataset.remove_item(File(id="syn1235"))

# Remove a folder from the dataset
# All child files are recursively removed from the dataset
my_dataset.remove_item(Folder(id="syn1236"))

# Remove an entity reference from the dataset
my_dataset.remove_item(EntityRef(id="syn1237", version=1))

my_dataset.store()

Query data from a dataset.

 

from synapseclient import Synapse
from synapseclient.models import Dataset

syn = Synapse()
syn.login()

my_dataset = Dataset(id="syn1234").get()
row = my_dataset.query(query="SELECT * FROM syn1234 WHERE id = 'syn1235'")
print(row)

Add a custom column to a dataset.

 

from synapseclient import Synapse
from synapseclient.models import Dataset, Column, ColumnType

syn = Synapse()
syn.login()

my_dataset = Dataset(id="syn1234").get()
my_dataset.add_column(Column(name="my_annotation", column_type=ColumnType.STRING))
my_dataset.store()

Update custom column values in a dataset.

 

from synapseclient import Synapse
from synapseclient.models import Dataset
import pandas as pd

syn = Synapse()
syn.login()

my_dataset = Dataset(id="syn1234").get()
# my_annotation must already exist in the dataset as a custom column
modified_data = pd.DataFrame(
    {"id": ["syn1234"], "my_annotation": ["good data"]}
)
my_dataset.update_rows(values=modified_data, primary_keys=["id"], dry_run=False)

Save a snapshot of a dataset.

 

from synapseclient import Synapse
from synapseclient.models import Dataset

syn = Synapse()
syn.login()

my_dataset = Dataset(id="syn1234").get()
my_dataset.snapshot(comment="My first snapshot", label="My first snapshot")

Deleting a dataset

 

from synapseclient import Synapse
from synapseclient.models import Dataset

syn = Synapse()
syn.login()

Dataset(id="syn4567").delete()

Source code in synapseclient/models/dataset.py
@dataclass
@async_to_sync
class Dataset(
    DatasetSynchronousProtocol,
    AccessControllable,
    ViewBase,
    ViewStoreMixin,
    DeleteMixin,
    ColumnMixin,
    GetMixin,
    QueryMixin,
    ViewUpdateMixin,
    ViewSnapshotMixin,
):
    """A `Dataset` object represents the metadata of a Synapse Dataset.
    <https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/table/Dataset.html>

    Attributes:
        id: The unique immutable ID for this dataset. A new ID will be generated for new
            Datasets. Once issued, this ID is guaranteed to never change or be re-issued
        name: The name of this dataset. Must be 256 characters or less. Names may only
            contain: letters, numbers, spaces, underscores, hyphens, periods, plus
            signs, apostrophes, and parentheses
        description: The description of the dataset. Must be 1000 characters or less.
        etag: Synapse employs an Optimistic Concurrency Control (OCC) scheme to handle
            concurrent updates. Since the E-Tag changes every time an entity is updated
            it is used to detect when a client's current representation of an entity is
            out-of-date.
        created_on: The date this dataset was created.
        modified_on: The date this dataset was last modified.
            In YYYY-MM-DDThh:mm:ss.sssZ format
        created_by: The ID of the user that created this dataset.
        modified_by: The ID of the user that last modified this dataset.
        parent_id: The ID of the Entity that is the parent of this dataset.
        columns: The columns of this dataset. This is an ordered dictionary where the key is the
            name of the column and the value is the Column object. When creating a new instance
            of a Dataset object you may pass any of the following types as the `columns` argument:

            - A list of Column objects
            - A dictionary where the key is the name of the column and the value is the Column object
            - An OrderedDict where the key is the name of the column and the value is the Column object

            The order of the columns will be the order they are stored in Synapse. If you need
            to reorder the columns, the recommended approach is to use the `.reorder_column()`
            method. Additionally, you may add and delete columns using the `.add_column()`
            and `.delete_column()` methods on your dataset class instance.

            You may modify the attributes of the Column object to change the column
            type, name, or other attributes. For example, suppose you'd like to change a
            column from an INTEGER to a DOUBLE. You can do so by changing the column_type
            attribute of the Column object. The next time you store the dataset the column
            will be updated in Synapse with the new type.

            ```python
            from synapseclient import Synapse
            from synapseclient.models import Dataset
            from synapseclient.models import Column, ColumnType

            syn = Synapse()
            syn.login()

            dataset = Dataset(id="syn1234").get()
            dataset.columns["my_column"].column_type = ColumnType.DOUBLE
            dataset.store()
            ```

            Note that the keys in this dictionary should match the column names as they are in
            Synapse. However, know that the name attribute of the Column object is used for
            all interactions with the Synapse API. The OrderedDict key is purely for the usage
            of this interface. For example, if you wish to rename a column you may do so by
            changing the name attribute of the Column object. The key in the OrderedDict does
            not need to be changed. The next time you store the dataset the column will be updated
            in Synapse with the new name and the key in the OrderedDict will be updated.
        version_number: The version number issued to this version of the object.
        version_label: The version label for this dataset.
        version_comment: The version comment for this dataset.
        is_latest_version: If this is the latest version of the object.
        is_search_enabled: When creating or updating a dataset or view, specifies if full
            text search should be enabled. Note that enabling full text search might
            slow down the indexing of the dataset or view.
        items: The flat list of file entity references that define this dataset. This is effectively
            a list of the rows that are in/will be in the dataset after it is stored. The only way to add
            or remove rows is to add or remove items from this list.
        size: The cumulative size, in bytes, of all items (files) in the dataset. This is
            only correct after the dataset has been stored or newly read from Synapse.
        checksum: The checksum is computed over a sorted concatenation of the checksums
            of all items in the dataset. This is only correct after the dataset has been
            stored or newly read from Synapse.
        count: The number of items/files in the dataset. This is only correct after the
            dataset has been stored or newly read from Synapse.
        activity: The Activity model represents the main record of Provenance in
            Synapse. It is analogous to the Activity defined in the
            [W3C Specification](https://www.w3.org/TR/prov-n/) on Provenance.
        annotations: Additional metadata associated with the dataset. The key is the name
            of your desired annotations. The value is an object containing a list of
            values (use empty list to represent no values for key) and the value type
            associated with all values in the list.
        include_default_columns: When creating a dataset or view, specifies if default
            columns should be included. Default columns are columns that are
            automatically added to the dataset or view. These columns are managed by
            Synapse and cannot be modified. If you attempt to create a column with the
            same name as a default column, you will receive a warning when you store the
            dataset.

            **`include_default_columns` is only used if this is the first time that the
            view is being stored.** If you are updating an existing view this attribute
            will be ignored. If you want to add all default columns back to your view
            then you may use this code snippet to accomplish this:

            ```python
            import asyncio
            from synapseclient import Synapse
            from synapseclient.models import Dataset

            syn = Synapse()
            syn.login()

            async def main():
                view = await Dataset(id="syn1234").get_async()
                await view._append_default_columns()
                await view.store_async()

            asyncio.run(main())
            ```

            The column you are overriding will not behave the same as a default column.
            For example, suppose you create a column called `id` on a Dataset. When
            using a default column, the `id` stores the Synapse ID of each of the
            entities included in the scope of the view. If you override the `id` column
            with a new column, the `id` column will no longer store the Synapse ID of
            the entities in the view. Instead, it will store the values you provide when
            you store the dataset. It will be stored as an annotation on the entity for
            the row you are modifying.

    Example: Create a new dataset from a list of EntityRefs.
        Dataset items consist of references to Synapse Files using an Entity Reference.
        If you are adding items to a Dataset directly, you must provide them in the form of
        an `EntityRef` class instance.

        ```python
        from synapseclient import Synapse
        from synapseclient.models import Dataset, EntityRef

        syn = Synapse()
        syn.login()

        my_entity_refs = [EntityRef(id="syn1234"), EntityRef(id="syn1235"), EntityRef(id="syn1236")]
        my_dataset = Dataset(parent_id="syn987", name="my-new-dataset", items=my_entity_refs)
        my_dataset.store()
        ```

    Example: Add entities to an existing dataset.
        Using `add_item`, you can add Synapse entities that are Files, Folders, or EntityRefs that point to a Synapse entity.
        If the entity is a Folder (or an EntityRef that points to a folder), all of the child Files
        within the Folder will be added to the Dataset recursively.

        ```python
        from synapseclient import Synapse
        from synapseclient.models import Dataset, File, Folder, EntityRef

        syn = Synapse()
        syn.login()

        my_dataset = Dataset(id="syn1234").get()

        # Add a file to the dataset
        my_dataset.add_item(File(id="syn1235"))

        # Add a folder to the dataset
        # All child files are recursively added to the dataset
        my_dataset.add_item(Folder(id="syn1236"))

        # Add an entity reference to the dataset
        my_dataset.add_item(EntityRef(id="syn1237", version=1))

        my_dataset.store()
        ```

    Example: Remove entities from a dataset.
        &nbsp;


        ```python
        from synapseclient import Synapse
        from synapseclient.models import Dataset, File, Folder, EntityRef

        syn = Synapse()
        syn.login()

        my_dataset = Dataset(id="syn1234").get()

        # Remove a file from the dataset
        my_dataset.remove_item(File(id="syn1235"))

        # Remove a folder from the dataset
        # All child files are recursively removed from the dataset
        my_dataset.remove_item(Folder(id="syn1236"))

        # Remove an entity reference from the dataset
        my_dataset.remove_item(EntityRef(id="syn1237", version=1))

        my_dataset.store()
        ```

    Example: Query data from a dataset.
        &nbsp;

        ```python
        from synapseclient import Synapse
        from synapseclient.models import Dataset

        syn = Synapse()
        syn.login()

        my_dataset = Dataset(id="syn1234").get()
        row = my_dataset.query(query="SELECT * FROM syn1234 WHERE id = 'syn1235'")
        print(row)
        ```

    Example: Add a custom column to a dataset.
        &nbsp;

        ```python
        from synapseclient import Synapse
        from synapseclient.models import Dataset, Column, ColumnType

        syn = Synapse()
        syn.login()

        my_dataset = Dataset(id="syn1234").get()
        my_dataset.add_column(Column(name="my_annotation", column_type=ColumnType.STRING))
        my_dataset.store()
        ```

    Example: Update custom column values in a dataset.
        &nbsp;

        ```python
        from synapseclient import Synapse
        from synapseclient.models import Dataset
        import pandas as pd

        syn = Synapse()
        syn.login()

        my_dataset = Dataset(id="syn1234").get()
        # my_annotation must already exist in the dataset as a custom column
        modified_data = pd.DataFrame(
            {"id": ["syn1234"], "my_annotation": ["good data"]}
        )
        my_dataset.update_rows(values=modified_data, primary_keys=["id"], dry_run=False)
        ```

    Example: Save a snapshot of a dataset.
        &nbsp;

        ```python
        from synapseclient import Synapse
        from synapseclient.models import Dataset

        syn = Synapse()
        syn.login()

        my_dataset = Dataset(id="syn1234").get()
        my_dataset.snapshot(comment="My first snapshot", label="My first snapshot")
        ```

    Example: Deleting a dataset
        &nbsp;
        ```python
        from synapseclient import Synapse
        from synapseclient.models import Dataset

        syn = Synapse()
        syn.login()

        Dataset(id="syn4567").delete()
        ```
    """

    id: Optional[str] = None
    """The unique immutable ID for this dataset. A new ID will be generated for new
    datasets. Once issued, this ID is guaranteed to never change or be re-issued"""

    name: Optional[str] = None
    """The name of this dataset. Must be 256 characters or less. Names may only
    contain: letters, numbers, spaces, underscores, hyphens, periods, plus signs,
    apostrophes, and parentheses"""

    description: Optional[str] = None
    """The description of this entity. Must be 1000 characters or less."""

    etag: Optional[str] = field(default=None, compare=False)
    """
    Synapse employs an Optimistic Concurrency Control (OCC) scheme to handle
    concurrent updates. Since the E-Tag changes every time an entity is updated it is
    used to detect when a client's current representation of an entity is out-of-date.
    """

    created_on: Optional[str] = field(default=None, compare=False)
    """The date this dataset was created."""

    modified_on: Optional[str] = field(default=None, compare=False)
    """The date this dataset was last modified. In YYYY-MM-DD-Thh:mm:ss.sssZ format"""

    created_by: Optional[str] = field(default=None, compare=False)
    """The ID of the user that created this dataset."""

    modified_by: Optional[str] = field(default=None, compare=False)
    """The ID of the user that last modified this dataset."""

    parent_id: Optional[str] = None
    """The ID of the Entity that is the parent of this dataset."""

    version_number: Optional[int] = field(default=None, compare=False)
    """The version number issued to this version on the object."""

    version_label: Optional[str] = None
    """The version label for this dataset."""

    version_comment: Optional[str] = None
    """The version comment for this dataset."""

    is_latest_version: Optional[bool] = field(default=None, compare=False)
    """If this is the latest version of the object."""

    is_search_enabled: Optional[bool] = None
    """When creating or updating a dataset or view specifies if full text search
    should be enabled. Note that enabling full text search might slow down the
    indexing of the dataset or view."""

    items: Optional[List[EntityRef]] = field(default_factory=list, compare=False)
    """The flat list of file entity references that define this dataset."""

    size: Optional[int] = field(default=None, compare=False)
    """The cumulative size, in bytes, of all items(files) in the dataset.

    This is only correct after the dataset has been stored or newly read from Synapse.
    """

    checksum: Optional[str] = field(default=None, compare=False)
    """The checksum is computed over a sorted concatenation of the checksums of all
    items in the dataset.

    This is only correct after the dataset has been stored or newly read from Synapse.
    """

    count: Optional[int] = field(default=None, compare=False)
    """The number of items/files in the dataset.

    This is only correct after the dataset has been stored or newly read from Synapse.
    """

    columns: Optional[
        Union[List[Column], OrderedDict[str, Column], Dict[str, Column]]
    ] = field(default_factory=OrderedDict, compare=False)
    """
    The columns of this dataset. This is an ordered dictionary where the key is the
    name of the column and the value is the Column object. When creating a new instance
    of a Dataset object you may pass any of the following types as the `columns` argument:

    - A list of Column objects
    - A dictionary where the key is the name of the column and the value is the Column object
    - An OrderedDict where the key is the name of the column and the value is the Column object

    The order of the columns will be the order they are stored in Synapse. If you need
    to reorder the columns, the recommended approach is to use the `.reorder_column()`
    method. Additionally, you may add and delete columns using the `.add_column()`
    and `.delete_column()` methods on your dataset class instance.

    You may modify the attributes of the Column object to change the column
    type, name, or other attributes. For example, suppose you'd like to change a
    column from an INTEGER to a DOUBLE. You can do so by changing the column_type
    attribute of the Column object. The next time you store the dataset the column
    will be updated in Synapse with the new type.

    ```python
    from synapseclient import Synapse
    from synapseclient.models import Dataset, Column, ColumnType

    syn = Synapse()
    syn.login()

    dataset = Dataset(id="syn1234").get()
    dataset.columns["my_column"].column_type = ColumnType.DOUBLE
    dataset.store()
    ```

    Note that the keys in this dictionary should match the column names as they are in
    Synapse. However, know that the name attribute of the Column object is used for
    all interactions with the Synapse API. The OrderedDict key is purely for the usage
    of this interface. For example, if you wish to rename a column you may do so by
    changing the name attribute of the Column object. The key in the OrderedDict does
    not need to be changed. The next time you store the dataset the column will be updated
    in Synapse with the new name and the key in the OrderedDict will be updated.
    """

    _columns_to_delete: Optional[Dict[str, Column]] = field(default_factory=dict)
    """
    Columns to delete when the dataset is stored. The key in this dict is the ID of the
    column to delete. The value is the Column object that represents the column to
    delete.
    """

    activity: Optional[Activity] = field(default=None, compare=False)
    """The Activity model represents the main record of Provenance in Synapse.  It is
    analogous to the Activity defined in the
    [W3C Specification](https://www.w3.org/TR/prov-n/) on Provenance."""

    annotations: Optional[
        Dict[
            str,
            Union[
                List[str],
                List[bool],
                List[float],
                List[int],
                List[date],
                List[datetime],
            ],
        ]
    ] = field(default_factory=dict, compare=False)
    """Additional metadata associated with the dataset. The key is the name of your
    desired annotations. The value is an object containing a list of values
    (use empty list to represent no values for key) and the value type associated with
    all values in the list. To remove all annotations set this to an empty dict `{}`"""

    _last_persistent_instance: Optional["Dataset"] = field(
        default=None, repr=False, compare=False
    )
    """The last persistent instance of this object. This is used to determine if the
    object has been changed and needs to be updated in Synapse."""

    view_entity_type: ViewEntityType = ViewEntityType.DATASET
    """The API model string for the type of view. This is used to determine the default columns that are
    added to the table. Must be defined as a `ViewEntityType` enum.
    """

    view_type_mask: ViewTypeMask = ViewTypeMask.DATASET
    """The Bit Mask representing Dataset type.
    As defined in the Synapse REST API:
    <https://rest-docs.synapse.org/rest/GET/column/tableview/defaults.html>"""

    def __post_init__(self):
        self.columns = self._convert_columns_to_ordered_dict(columns=self.columns)

    @property
    def has_changed(self) -> bool:
        """Determines if the object has been changed and needs to be updated in Synapse."""
        return (
            not self._last_persistent_instance
            or self._last_persistent_instance != self
            or (not self._last_persistent_instance.items and self.items)
            or self._last_persistent_instance.items != self.items
        )

    def _set_last_persistent_instance(self) -> None:
        """Stash the last time this object interacted with Synapse. This is used to
        determine if the object has been changed and needs to be updated in Synapse."""
        del self._last_persistent_instance
        self._last_persistent_instance = dataclasses.replace(self)
        self._last_persistent_instance.activity = (
            dataclasses.replace(self.activity) if self.activity else None
        )
        self._last_persistent_instance.columns = (
            OrderedDict(
                (key, dataclasses.replace(column))
                for key, column in self.columns.items()
            )
            if self.columns
            else OrderedDict()
        )
        self._last_persistent_instance.annotations = (
            deepcopy(self.annotations) if self.annotations else {}
        )
        self._last_persistent_instance.items = (
            [dataclasses.replace(item) for item in self.items] if self.items else []
        )

    def fill_from_dict(self, entity, set_annotations: bool = True) -> "Self":
        """
        Converts the data coming from the Synapse API into this datamodel.

        Arguments:
            entity: The data coming from the Synapse API

        Returns:
            The Dataset object instance.
        """
        self.id = entity.get("id", None)
        self.name = entity.get("name", None)
        self.description = entity.get("description", None)
        self.parent_id = entity.get("parentId", None)
        self.etag = entity.get("etag", None)
        self.created_on = entity.get("createdOn", None)
        self.created_by = entity.get("createdBy", None)
        self.modified_on = entity.get("modifiedOn", None)
        self.modified_by = entity.get("modifiedBy", None)
        self.version_number = entity.get("versionNumber", None)
        self.version_label = entity.get("versionLabel", None)
        self.version_comment = entity.get("versionComment", None)
        self.is_latest_version = entity.get("isLatestVersion", None)
        self.is_search_enabled = entity.get("isSearchEnabled", False)
        self.size = entity.get("size", None)
        self.checksum = entity.get("checksum", None)
        self.count = entity.get("count", None)
        self.items = [
            EntityRef(id=item["entityId"], version=item["versionNumber"])
            for item in entity.get("items", [])
        ]

        if set_annotations:
            self.annotations = Annotations.from_dict(entity.get("annotations", {}))
        return self

    def to_synapse_request(self):
        """Converts the request to a request expected of the Synapse REST API."""

        entity = {
            "name": self.name,
            "description": self.description,
            "id": self.id,
            "etag": self.etag,
            "createdOn": self.created_on,
            "modifiedOn": self.modified_on,
            "createdBy": self.created_by,
            "modifiedBy": self.modified_by,
            "parentId": self.parent_id,
            "concreteType": concrete_types.DATASET_ENTITY,
            "versionNumber": self.version_number,
            "versionLabel": self.version_label,
            "versionComment": self.version_comment,
            "isLatestVersion": self.is_latest_version,
            "columnIds": (
                [
                    column.id
                    for column in self._last_persistent_instance.columns.values()
                ]
                if self._last_persistent_instance
                and self._last_persistent_instance.columns
                else []
            ),
            "isSearchEnabled": self.is_search_enabled,
            "items": (
                [item.to_synapse_request() for item in self.items] if self.items else []
            ),
            "size": self.size,
            "checksum": self.checksum,
            "count": self.count,
        }
        delete_none_keys(entity)
        result = {
            "entity": entity,
        }
        delete_none_keys(result)
        return result

    def _append_entity_ref(self, entity_ref: EntityRef) -> None:
        """Helper function to add an EntityRef to the items list of the dataset.
        Will not add duplicates.

        Arguments:
            entity_ref: The EntityRef to add to the items list of the dataset.
        """
        if entity_ref not in self.items:
            self.items.append(entity_ref)

    def add_item(
        self,
        item: Union[EntityRef, "File", "Folder"],
        *,
        synapse_client: Optional[Synapse] = None,
    ) -> None:
        """Adds an item in the form of an EntityRef to the dataset.
        For Folders, children are added recursively. Effect is not seen
        until the dataset is stored.

        Arguments:
            item: Entity to add to the dataset. Must be an EntityRef, File, or Folder.
            synapse_client: If not passed in and caching was not disabled by
                `Synapse.allow_client_caching(False)` this will use the last created
                instance from the Synapse class constructor.

        Raises:
            ValueError: If the item is not an EntityRef, File, or Folder

        Example: Add a file to a dataset.
            &nbsp;

            ```python
            from synapseclient import Synapse
            from synapseclient.models import Dataset, File

            syn = Synapse()
            syn.login()

            my_dataset = Dataset(id="syn1234").get()
            my_dataset.add_item(File(id="syn1235"))
            my_dataset.store()
            ```

        Example: Add a folder to a dataset.
            All child files are recursively added to the dataset.

            ```python
            from synapseclient import Synapse
            from synapseclient.models import Dataset, Folder

            syn = Synapse()
            syn.login()

            my_dataset = Dataset(id="syn1234").get()
            my_dataset.add_item(Folder(id="syn1236"))
            my_dataset.store()
            ```

        Example: Add an entity reference to a dataset.
            &nbsp;

            ```python
            from synapseclient import Synapse
            from synapseclient.models import Dataset, EntityRef

            syn = Synapse()
            syn.login()

            my_dataset = Dataset(id="syn1234").get()
            my_dataset.add_item(EntityRef(id="syn1237", version=1))
            my_dataset.store()
            ```
        """
        from synapseclient.models import File, Folder

        client = Synapse.get_client(synapse_client=synapse_client)

        if isinstance(item, EntityRef):
            self._append_entity_ref(entity_ref=item)
        elif isinstance(item, File):
            if not item.version_number:
                item = File(
                    id=item.id, version_number=item.version_number, download_file=False
                ).get()
            self._append_entity_ref(
                entity_ref=EntityRef(id=item.id, version=item.version_number)
            )
        elif isinstance(item, Folder):
            children = wrap_async_to_sync(item._retrieve_children(follow_link=True))
            for child in children:
                if child["type"] == concrete_types.FILE_ENTITY:
                    self._append_entity_ref(
                        entity_ref=EntityRef(
                            id=child["id"], version=child["versionNumber"]
                        )
                    )
                else:
                    self.add_item(item=Folder(id=child["id"]), synapse_client=client)
        else:
            raise ValueError(
                f"item must be one of EntityRef, File, or Folder. {item} is a {type(item)}"
            )

    def _remove_entity_ref(self, entity_ref: EntityRef) -> None:
        """Helper function to remove an EntityRef from the items list of the dataset.

        Arguments:
            entity_ref: The EntityRef to remove from the items list of the dataset.
        """
        if entity_ref not in self.items:
            raise ValueError(f"Entity {entity_ref.id} not found in items list")
        self.items.remove(entity_ref)

    def remove_item(
        self,
        item: Union[EntityRef, "File", "Folder"],
        *,
        synapse_client: Optional[Synapse] = None,
    ) -> None:
        """
        Removes an item from the dataset. For Folders, all
        children of the folder are removed recursively.
        Effect is not seen until the dataset is stored.

        Arguments:
            item: The entity to remove from the dataset. Must be an EntityRef, File, or Folder.
            synapse_client: If not passed in and caching was not disabled by
                `Synapse.allow_client_caching(False)` this will use the last created
                instance from the Synapse class constructor.

        Returns:
            None

        Raises:
            ValueError: If the item is not a valid type

        Example: Remove a file from a dataset.
            &nbsp;

            ```python
            from synapseclient import Synapse
            from synapseclient.models import Dataset, File

            syn = Synapse()
            syn.login()

            my_dataset = Dataset(id="syn1234").get()
            my_dataset.remove_item(File(id="syn1235"))
            my_dataset.store()
            ```

        Example: Remove a folder from a dataset.
            All child files are recursively removed from the dataset.

            ```python
            from synapseclient import Synapse
            from synapseclient.models import Dataset, Folder

            syn = Synapse()
            syn.login()

            my_dataset = Dataset(id="syn1234").get()
            my_dataset.remove_item(Folder(id="syn1236"))
            my_dataset.store()
            ```

        Example: Remove an entity reference from a dataset.
            &nbsp;
            ```python
            from synapseclient import Synapse
            from synapseclient.models import Dataset, EntityRef

            syn = Synapse()
            syn.login()

            my_dataset = Dataset(id="syn1234").get()
            my_dataset.remove_item(EntityRef(id="syn1237", version=1))
            my_dataset.store()
            ```
        """
        from synapseclient.models import File, Folder

        client = Synapse.get_client(synapse_client=synapse_client)

        if isinstance(item, EntityRef):
            self._remove_entity_ref(item)
        elif isinstance(item, File):
            if not item.version_number:
                item = File(
                    id=item.id, version_number=item.version_number, download_file=False
                ).get()
            self._remove_entity_ref(EntityRef(id=item.id, version=item.version_number))
        elif isinstance(item, Folder):
            children = wrap_async_to_sync(item._retrieve_children(follow_link=True))
            for child in children:
                if child["type"] == concrete_types.FILE_ENTITY:
                    self._remove_entity_ref(
                        EntityRef(id=child["id"], version=child["versionNumber"])
                    )
                else:
                    self.remove_item(item=Folder(id=child["id"]), synapse_client=client)
        else:
            raise ValueError(
                f"item must be one of str, EntityRef, File, or Folder, {item} is a {type(item)}"
            )

    async def store_async(
        self,
        dry_run: bool = False,
        *,
        job_timeout: int = 600,
        synapse_client: Optional[Synapse] = None,
    ) -> "Self":
        """Store information about a Dataset including the columns and annotations.
        Storing an update to the Dataset items will alter the rows present in the Dataset.
        Datasets have default columns that are managed by Synapse. The default behavior of
        this function is to include these default columns in the dataset when it is stored.
        This means that with the default behavior, any columns that you have added to your
        Dataset will be overwritten by the default columns if they have the same name. To
        avoid this behavior, set the `include_default_columns` attribute to `False`.

        Note the following behavior for the order of columns:

        - If a column is added via the `add_column` method it will be added at the
            index you specify, or at the end of the columns list.
        - If column(s) are added during the construction of your Dataset instance, i.e.
            `Dataset(columns=[Column(name="foo")])`, they will be added at the beginning
            of the columns list.
        - If you use the `store_rows` method and the `schema_storage_strategy` is set to
            `INFER_FROM_DATA` the columns will be added at the end of the columns list.

        Arguments:
            dry_run: If True, will not actually store the table but will log to
                the console what would have been stored.
            job_timeout: The maximum amount of time to wait for a job to complete.
                This is used when updating the table schema. If the timeout
                is reached a `SynapseTimeoutError` will be raised.
                The default is 600 seconds
            synapse_client: If not passed in and caching was not disabled by
                `Synapse.allow_client_caching(False)` this will use the last created
                instance from the Synapse class constructor.

        Returns:
            The Dataset instance stored in synapse.

        Example: Create a new dataset from a list of EntityRefs by storing it.
            &nbsp;
            ```python
            import asyncio
            from synapseclient import Synapse
            from synapseclient.models import Dataset, EntityRef

            syn = Synapse()
            syn.login()

            async def main():
                my_entity_refs = [EntityRef(id="syn1234"), EntityRef(id="syn1235"), EntityRef(id="syn1236")]
                my_dataset = Dataset(parent_id="syn987", name="my-new-dataset", items=my_entity_refs)
                await my_dataset.store_async()

            asyncio.run(main())
            ```
        """
        return await super().store_async(
            dry_run=dry_run,
            job_timeout=job_timeout,
            synapse_client=synapse_client,
        )

    async def get_async(
        self,
        include_columns: bool = True,
        include_activity: bool = False,
        *,
        synapse_client: Optional[Synapse] = None,
    ) -> "Self":
        """Get the metadata about the Dataset from synapse.

        Arguments:
            include_columns: If True, will include fully filled column objects in the
                `.columns` attribute. Defaults to True.
            include_activity: If True the activity will be included in the Dataset
                if it exists. Defaults to False.

            synapse_client: If not passed in and caching was not disabled by
                `Synapse.allow_client_caching(False)` this will use the last created
                instance from the Synapse class constructor.

        Returns:
            The Dataset instance stored in synapse.

        Example: Getting metadata about a Dataset using id
            Get a Dataset by ID and print out the columns and activity. `include_columns`
            defaults to True and `include_activity` defaults to False. When you need to
            update existing columns or activity these need to be set to True during the
            `get_async` call, then you'll make the changes, and finally call the
            `.store_async()` method.

            ```python
            import asyncio
            from synapseclient import Synapse
            from synapseclient.models import Dataset

            syn = Synapse()
            syn.login()

            async def main():
                dataset = await Dataset(id="syn4567").get_async(include_activity=True)
                print(dataset)

                # Columns are retrieved by default
                print(dataset.columns)
                print(dataset.activity)

            asyncio.run(main())
            ```

        Example: Getting metadata about a Dataset using name and parent_id
            Get a Dataset by name/parent_id and print out the columns and activity.
            `include_columns` defaults to True and `include_activity` defaults to
            False. When you need to update existing columns or activity these need to
            be set to True during the `get_async` call, then you'll make the changes,
            and finally call the `.store_async()` method.

            ```python
            import asyncio
            from synapseclient import Synapse
            from synapseclient.models import Dataset

            syn = Synapse()
            syn.login()

            async def main():
                dataset = await Dataset(
                    name="my_dataset",
                    parent_id="syn1234"
                ).get_async(
                    include_columns=True,
                    include_activity=True
                )
                print(dataset)
                print(dataset.columns)
                print(dataset.activity)

            asyncio.run(main())
            ```
        """
        return await super().get_async(
            include_columns=include_columns,
            include_activity=include_activity,
            synapse_client=synapse_client,
        )

    async def delete_async(self, *, synapse_client: Optional[Synapse] = None) -> None:
        """Delete the dataset from synapse. This is not version specific. If you'd like
        to delete a specific version of the dataset you must use the
        [synapseclient.api.delete_entity][] function directly.

        Arguments:
            synapse_client: If not passed in and caching was not disabled by
                `Synapse.allow_client_caching(False)` this will use the last created
                instance from the Synapse class constructor.

        Returns:
            None

        Example: Deleting a dataset
            Deleting a dataset is only supported by the ID of the dataset.

            ```python
            import asyncio
            from synapseclient import Synapse
            from synapseclient.models import Dataset

            syn = Synapse()
            syn.login()

            async def main():
                await Dataset(id="syn4567").delete_async()

            asyncio.run(main())
            ```
        """
        await super().delete_async(synapse_client=synapse_client)

    async def update_rows_async(
        self,
        values: DATA_FRAME_TYPE,
        primary_keys: List[str],
        dry_run: bool = False,
        *,
        rows_per_query: int = 50000,
        update_size_bytes: int = 1.9 * MB,
        insert_size_bytes: int = 900 * MB,
        job_timeout: int = 600,
        wait_for_eventually_consistent_view: bool = False,
        wait_for_eventually_consistent_view_timeout: int = 600,
        synapse_client: Optional[Synapse] = None,
        **kwargs,
    ) -> None:
        """Update the values of rows in the dataset. This method can only
        be used to update values in custom columns. Default columns cannot be updated, but
        may be used as primary keys.

        Limitations:

        - When updating many rows the requests to Synapse will be chunked into smaller
            requests. The limit is 2MB per request. This chunking will happen
            automatically and should not be a concern for most users. If you are
            having issues with the request being too large you may lower the
            number of rows you are trying to update.
        - The `primary_keys` argument must contain at least one column.
        - The `primary_keys` argument cannot contain columns that are a LIST type.
        - The `primary_keys` argument cannot contain columns that are a JSON type.
        - The values used as the `primary_keys` must be unique in the table. If there
            are multiple rows with the same values in the `primary_keys`, an exception
            will be raised.
        - The columns used in `primary_keys` cannot contain updated values. Since
            the values in these columns are used to determine if a row exists, they
            cannot be updated in the same transaction.

        Arguments:
            values: Supports storing data from the following sources:

                - A string holding the path to a CSV file. The data will be read into a
                    [Pandas DataFrame](http://pandas.pydata.org/pandas-docs/stable/api.html#dataframe).
                    The code makes assumptions about the format of the columns in the
                    CSV as detailed in the [csv_to_pandas_df][synapseclient.models.mixins.table_components.csv_to_pandas_df]
                    function. You may pass in additional arguments to the `csv_to_pandas_df`
                    function by passing them in as keyword arguments to this function.
                - A dictionary where the key is the column name and the value is one or
                    more values. The values will be wrapped into a [Pandas DataFrame](http://pandas.pydata.org/pandas-docs/stable/api.html#dataframe). You may pass in additional arguments to the `pd.DataFrame` function by passing them in as keyword arguments to this function. Read about the available arguments in the [Pandas DataFrame](https://pandas.pydata.org/docs/reference/api/pandas.DataFrame.html) documentation.
                - A [Pandas DataFrame](http://pandas.pydata.org/pandas-docs/stable/api.html#dataframe)

            primary_keys: The columns to use to determine if a row already exists. If
                a row exists with the same values in the columns specified in this list
                the row will be updated. If a row does not exist nothing will be done.

            dry_run: If set to True the data will not be updated in Synapse. A message
                will be printed to the console with the number of rows that would have
                been updated and inserted. If you would like to see the data that would
                be updated and inserted you may set the `dry_run` argument to True and
                set the log level to DEBUG by setting the debug flag when creating
                your Synapse class instance like: `syn = Synapse(debug=True)`.

            rows_per_query: The number of rows that will be queried from Synapse per
                request. Since we need to query for the data that is being updated
                this will determine the number of rows that are queried at a time.
                The default is 50,000 rows.

            update_size_bytes: The maximum size of the request that will be sent to Synapse
                when updating rows of data. The default is 1.9MB.

            insert_size_bytes: The maximum size of the request that will be sent to Synapse
                when inserting rows of data. The default is 900MB.

            job_timeout: The maximum amount of time to wait for a job to complete.
                This is used when inserting and updating rows of data. Each individual
                request to Synapse will be sent as an independent job. If the timeout
                is reached a `SynapseTimeoutError` will be raised.
                The default is 600 seconds.

            wait_for_eventually_consistent_view: Only used if the table is a view. If
                set to True this will wait for the view to reflect any changes that
                you've made to the view. This is useful if you need to query the view
                after making changes to the data.

            wait_for_eventually_consistent_view_timeout: The maximum amount of time to
                wait for a view to be eventually consistent. The default is 600 seconds.

            synapse_client: If not passed in and caching was not disabled by
                `Synapse.allow_client_caching(False)` this will use the last created
                instance from the Synapse class constructor.

            **kwargs: Additional arguments that are passed to the `pd.DataFrame`
                function when the `values` argument is a path to a csv file.


        Example: Update custom column values in a dataset.
            &nbsp;

            ```python
            import asyncio
            from synapseclient import Synapse
            from synapseclient.models import Dataset
            import pandas as pd

            syn = Synapse()
            syn.login()

            async def main():
                my_dataset = await Dataset(id="syn1234").get_async()

                # my_annotation must already exist in the dataset as a custom column
                modified_data = pd.DataFrame(
                    {"id": ["syn1234"], "my_annotation": ["good data"]}
                )
                await my_dataset.update_rows_async(values=modified_data, primary_keys=["id"], dry_run=False)

            asyncio.run(main())
            ```
        """
        await super().update_rows_async(
            values=values,
            primary_keys=primary_keys,
            dry_run=dry_run,
            rows_per_query=rows_per_query,
            update_size_bytes=update_size_bytes,
            insert_size_bytes=insert_size_bytes,
            job_timeout=job_timeout,
            wait_for_eventually_consistent_view=wait_for_eventually_consistent_view,
            wait_for_eventually_consistent_view_timeout=wait_for_eventually_consistent_view_timeout,
            synapse_client=synapse_client,
            **kwargs,
        )

    async def snapshot_async(
        self,
        *,
        comment: Optional[str] = None,
        label: Optional[str] = None,
        include_activity: bool = True,
        associate_activity_to_new_version: bool = True,
        synapse_client: Optional[Synapse] = None,
    ) -> "TableUpdateTransaction":
        """Creates a snapshot of the dataset. A snapshot is a saved, read-only version of the dataset
        at the time it was created. Dataset snapshots are created using the asynchronous job API.

        Arguments:
            comment: A unique comment to associate with the snapshot.
            label: A unique label to associate with the snapshot.
            include_activity: If True the activity will be included in the snapshot if it
                exists. In order to include the activity, the activity must have already
                been stored in Synapse by using the `activity` attribute on the Dataset
                and calling the `store()` method on the Dataset instance. Adding an
                activity to a snapshot of a dataset is meant to capture the provenance of
                the data at the time of the snapshot. Defaults to True.
            associate_activity_to_new_version: If True the activity will be associated
                with the new version of the dataset. If False the activity will not be
                associated with the new version of the dataset. Defaults to True.
            synapse_client: If not passed in and caching was not disabled by
                `Synapse.allow_client_caching(False)` this will use the last created
                instance from the Synapse class constructor.

        Returns:
            A `TableUpdateTransaction` object which includes the version number of the snapshot.

        Example: Save a snapshot of a dataset.
            &nbsp;

            ```python
            import asyncio
            from synapseclient import Synapse
            from synapseclient.models import Dataset

            syn = Synapse()
            syn.login()

            async def main():
                my_dataset = await Dataset(id="syn1234").get_async()
                await my_dataset.snapshot_async(comment="My first snapshot", label="My first snapshot")

            asyncio.run(main())
            ```
        """
        return await super().snapshot_async(
            comment=comment,
            label=label,
            include_activity=include_activity,
            associate_activity_to_new_version=associate_activity_to_new_version,
            synapse_client=synapse_client,
        )

Functions

store_async async

```python
store_async(dry_run: bool = False, *, job_timeout: int = 600, synapse_client: Optional[Synapse] = None) -> Self
```

Store information about a Dataset including the columns and annotations. Storing an update to the Dataset items will alter the rows present in the Dataset. Datasets have default columns that are managed by Synapse. The default behavior of this function is to include these default columns in the dataset when it is stored. This means that with the default behavior, any columns that you have added to your Dataset will be overwritten by the default columns if they have the same name. To avoid this behavior, set the include_default_columns attribute to False.

Note the following behavior for the order of columns:

  • If a column is added via the add_column method it will be added at the index you specify, or at the end of the columns list.
  • If column(s) are added during the construction of your Dataset instance, i.e. Dataset(columns=[Column(name="foo")]), they will be added at the beginning of the columns list.
  • If you use the store_rows method and the schema_storage_strategy is set to INFER_FROM_DATA the columns will be added at the end of the columns list.
PARAMETER DESCRIPTION
dry_run

If True, will not actually store the table but will log to the console what would have been stored.

TYPE: bool DEFAULT: False

job_timeout

The maximum amount of time to wait for a job to complete. This is used when updating the table schema. If the timeout is reached a SynapseTimeoutError will be raised. The default is 600 seconds.

TYPE: int DEFAULT: 600

synapse_client

If not passed in and caching was not disabled by Synapse.allow_client_caching(False) this will use the last created instance from the Synapse class constructor.

TYPE: Optional[Synapse] DEFAULT: None

RETURNS DESCRIPTION
Self

The Dataset instance stored in Synapse.

Create a new dataset from a list of EntityRefs by storing it.

```python
import asyncio
from synapseclient import Synapse
from synapseclient.models import Dataset, EntityRef

syn = Synapse()
syn.login()

async def main():
    my_entity_refs = [EntityRef(id="syn1234"), EntityRef(id="syn1235"), EntityRef(id="syn1236")]
    my_dataset = Dataset(parent_id="syn987", name="my-new-dataset", items=my_entity_refs)
    await my_dataset.store_async()

asyncio.run(main())
```
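
If the Synapse-managed default columns would overwrite custom columns of the same name, the `include_default_columns` attribute described above can be turned off before storing. A minimal sketch, assuming a dataset that already has the custom columns you want to keep; everything except the documented attribute name is illustrative:

```python
import asyncio
from synapseclient import Synapse
from synapseclient.models import Dataset

syn = Synapse()
syn.login()

async def main():
    my_dataset = await Dataset(id="syn1234").get_async()
    # Skip merging the Synapse-managed default columns so custom columns
    # with the same names are not overwritten when the dataset is stored.
    my_dataset.include_default_columns = False
    await my_dataset.store_async()

asyncio.run(main())
```
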

Source code in synapseclient/models/dataset.py
````python
async def store_async(
    self,
    dry_run: bool = False,
    *,
    job_timeout: int = 600,
    synapse_client: Optional[Synapse] = None,
) -> "Self":
    """Store information about a Dataset including the columns and annotations.
    Storing an update to the Dataset items will alter the rows present in the Dataset.
    Datasets have default columns that are managed by Synapse. The default behavior of
    this function is to include these default columns in the dataset when it is stored.
    This means that with the default behavior, any columns that you have added to your
    Dataset will be overwritten by the default columns if they have the same name. To
    avoid this behavior, set the `include_default_columns` attribute to `False`.

    Note the following behavior for the order of columns:

    - If a column is added via the `add_column` method it will be added at the
        index you specify, or at the end of the columns list.
    - If column(s) are added during the construction of your Dataset instance, i.e.
        `Dataset(columns=[Column(name="foo")])`, they will be added at the beginning
        of the columns list.
    - If you use the `store_rows` method and the `schema_storage_strategy` is set to
        `INFER_FROM_DATA` the columns will be added at the end of the columns list.

    Arguments:
        dry_run: If True, will not actually store the table but will log to
            the console what would have been stored.
        job_timeout: The maximum amount of time to wait for a job to complete.
            This is used when updating the table schema. If the timeout
            is reached a `SynapseTimeoutError` will be raised.
            The default is 600 seconds.
        synapse_client: If not passed in and caching was not disabled by
            `Synapse.allow_client_caching(False)` this will use the last created
            instance from the Synapse class constructor.

    Returns:
        The Dataset instance stored in Synapse.

    Example: Create a new dataset from a list of EntityRefs by storing it.
        &nbsp;
        ```python
        import asyncio
        from synapseclient import Synapse
        from synapseclient.models import Dataset, EntityRef

        syn = Synapse()
        syn.login()

        async def main():
            my_entity_refs = [EntityRef(id="syn1234"), EntityRef(id="syn1235"), EntityRef(id="syn1236")]
            my_dataset = Dataset(parent_id="syn987", name="my-new-dataset", items=my_entity_refs)
            await my_dataset.store_async()

        asyncio.run(main())
        ```
    """
    return await super().store_async(
        dry_run=dry_run,
        job_timeout=job_timeout,
        synapse_client=synapse_client,
    )
````

get_async async

```python
get_async(include_columns: bool = True, include_activity: bool = False, *, synapse_client: Optional[Synapse] = None) -> Self
```

Get the metadata about the Dataset from Synapse.

PARAMETER DESCRIPTION
include_columns

If True, will include fully filled column objects in the .columns attribute. Defaults to True.

TYPE: bool DEFAULT: True

include_activity

If True the activity will be included in the Dataset if it exists. Defaults to False.

TYPE: bool DEFAULT: False

synapse_client

If not passed in and caching was not disabled by Synapse.allow_client_caching(False) this will use the last created instance from the Synapse class constructor.

TYPE: Optional[Synapse] DEFAULT: None

RETURNS DESCRIPTION
Self

The Dataset instance stored in Synapse.

Getting metadata about a Dataset using id

Get a Dataset by ID and print out the columns and activity. include_columns defaults to True and include_activity defaults to False. When you need to update existing columns or activity, set these to True during the get_async call, make your changes, and finally call the .store_async() method.

```python
import asyncio
from synapseclient import Synapse
from synapseclient.models import Dataset

syn = Synapse()
syn.login()

async def main():
    dataset = await Dataset(id="syn4567").get_async(include_activity=True)
    print(dataset)

    # Columns are retrieved by default
    print(dataset.columns)
    print(dataset.activity)

asyncio.run(main())
```

Getting metadata about a Dataset using name and parent_id

Get a Dataset by name/parent_id and print out the columns and activity. include_columns defaults to True and include_activity defaults to False. When you need to update existing columns or activity, set these to True during the get_async call, make your changes, and finally call the .store_async() method.

```python
import asyncio
from synapseclient import Synapse
from synapseclient.models import Dataset

syn = Synapse()
syn.login()

async def main():
    dataset = await Dataset(
        name="my_dataset",
        parent_id="syn1234"
    ).get_async(
        include_columns=True,
        include_activity=True
    )
    print(dataset)
    print(dataset.columns)
    print(dataset.activity)

asyncio.run(main())
```
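
To make the get-modify-store cycle described above concrete, here is a minimal sketch; the column name `my_column` and the specific type change are illustrative assumptions, not part of the documented example:

```python
import asyncio
from synapseclient import Synapse
from synapseclient.models import ColumnType, Dataset

syn = Synapse()
syn.login()

async def main():
    # Retrieve the dataset with its columns so they can be modified in place.
    dataset = await Dataset(id="syn4567").get_async(include_columns=True)

    # "my_column" is assumed to be an existing custom column on the dataset.
    dataset.columns["my_column"].column_type = ColumnType.DOUBLE
    await dataset.store_async()

asyncio.run(main())
```
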
Source code in synapseclient/models/dataset.py
````python
async def get_async(
    self,
    include_columns: bool = True,
    include_activity: bool = False,
    *,
    synapse_client: Optional[Synapse] = None,
) -> "Self":
    """Get the metadata about the Dataset from synapse.

    Arguments:
        include_columns: If True, will include fully filled column objects in the
            `.columns` attribute. Defaults to True.
        include_activity: If True the activity will be included in the Dataset
            if it exists. Defaults to False.

        synapse_client: If not passed in and caching was not disabled by
            `Synapse.allow_client_caching(False)` this will use the last created
            instance from the Synapse class constructor.

    Returns:
        The Dataset instance stored in Synapse.

    Example: Getting metadata about a Dataset using id
        Get a Dataset by ID and print out the columns and activity. `include_columns`
        defaults to True and `include_activity` defaults to False. When you need to
        update existing columns or activity these need to be set to True during the
        `get_async` call, then you'll make the changes, and finally call the
        `.store_async()` method.

        ```python
        import asyncio
        from synapseclient import Synapse
        from synapseclient.models import Dataset

        syn = Synapse()
        syn.login()

        async def main():
            dataset = await Dataset(id="syn4567").get_async(include_activity=True)
            print(dataset)

            # Columns are retrieved by default
            print(dataset.columns)
            print(dataset.activity)

        asyncio.run(main())
        ```

    Example: Getting metadata about a Dataset using name and parent_id
        Get a Dataset by name/parent_id and print out the columns and activity.
        `include_columns` defaults to True and `include_activity` defaults to
        False. When you need to update existing columns or activity these need to
        be set to True during the `get_async` call, then you'll make the changes,
        and finally call the `.store_async()` method.

        ```python
        import asyncio
        from synapseclient import Synapse
        from synapseclient.models import Dataset

        syn = Synapse()
        syn.login()

        async def main():
            dataset = await Dataset(
                name="my_dataset",
                parent_id="syn1234"
            ).get_async(
                include_columns=True,
                include_activity=True
            )
            print(dataset)
            print(dataset.columns)
            print(dataset.activity)

        asyncio.run(main())
        ```
    """
    return await super().get_async(
        include_columns=include_columns,
        include_activity=include_activity,
        synapse_client=synapse_client,
    )
````

delete_async async

```python
delete_async(*, synapse_client: Optional[Synapse] = None) -> None
```

Delete the dataset from Synapse. This is not version specific. If you'd like to delete a specific version of the dataset you must use the synapseclient.api.delete_entity function directly.

PARAMETER DESCRIPTION
synapse_client

If not passed in and caching was not disabled by Synapse.allow_client_caching(False) this will use the last created instance from the Synapse class constructor.

TYPE: Optional[Synapse] DEFAULT: None

RETURNS DESCRIPTION
None

None

Deleting a dataset

Deleting a dataset is only supported by the ID of the dataset.

```python
import asyncio
from synapseclient import Synapse
from synapseclient.models import Dataset

syn = Synapse()
syn.login()

async def main():
    await Dataset(id="syn4567").delete_async()

asyncio.run(main())
```
Source code in synapseclient/models/dataset.py
````python
async def delete_async(self, *, synapse_client: Optional[Synapse] = None) -> None:
    """Delete the dataset from synapse. This is not version specific. If you'd like
    to delete a specific version of the dataset you must use the
    [synapseclient.api.delete_entity][] function directly.

    Arguments:
        synapse_client: If not passed in and caching was not disabled by
            `Synapse.allow_client_caching(False)` this will use the last created
            instance from the Synapse class constructor.

    Returns:
        None

    Example: Deleting a dataset
        Deleting a dataset is only supported by the ID of the dataset.

        ```python
        import asyncio
        from synapseclient import Synapse
        from synapseclient.models import Dataset

        syn = Synapse()
        syn.login()

        async def main():
            await Dataset(id="syn4567").delete_async()

        asyncio.run(main())
        ```
    """
    await super().delete_async(synapse_client=synapse_client)
````

update_rows_async async

```python
update_rows_async(values: DATA_FRAME_TYPE, primary_keys: List[str], dry_run: bool = False, *, rows_per_query: int = 50000, update_size_bytes: int = 1.9 * MB, insert_size_bytes: int = 900 * MB, job_timeout: int = 600, wait_for_eventually_consistent_view: bool = False, wait_for_eventually_consistent_view_timeout: int = 600, synapse_client: Optional[Synapse] = None, **kwargs) -> None
```

Update the values of rows in the dataset. This method can only be used to update values in custom columns. Default columns cannot be updated, but may be used as primary keys.

Limitations:

  • When updating many rows the requests to Synapse will be chunked into smaller requests. The limit is 2MB per request. This chunking will happen automatically and should not be a concern for most users. If you are having issues with the request being too large you may lower the number of rows you are trying to update.
  • The primary_keys argument must contain at least one column.
  • The primary_keys argument cannot contain columns that are a LIST type.
  • The primary_keys argument cannot contain columns that are a JSON type.
  • The values used as the primary_keys must be unique in the table. If multiple rows share the same values in the primary_keys, an exception will be raised.
  • The columns used in primary_keys cannot contain updated values. Since the values in these columns are used to determine if a row exists, they cannot be updated in the same transaction.
PARAMETER DESCRIPTION
values

Supports storing data from the following sources:

  • A string holding the path to a CSV file. The data will be read into a Pandas DataFrame. The code makes assumptions about the format of the columns in the CSV as detailed in the csv_to_pandas_df function. You may pass in additional arguments to the csv_to_pandas_df function by passing them in as keyword arguments to this function.
  • A dictionary where the key is the column name and the value is one or more values. The values will be wrapped into a Pandas DataFrame. You may pass in additional arguments to the pd.DataFrame function by passing them in as keyword arguments to this function. Read about the available arguments in the Pandas DataFrame documentation.
  • A Pandas DataFrame

TYPE: DATA_FRAME_TYPE

primary_keys

The columns to use to determine if a row already exists. If a row exists with the same values in the columns specified in this list the row will be updated. If a row does not exist nothing will be done.

TYPE: List[str]

dry_run

If set to True the data will not be updated in Synapse. A message will be printed to the console with the number of rows that would have been updated and inserted. If you would like to see the data that would be updated and inserted you may set the dry_run argument to True and set the log level to DEBUG by setting the debug flag when creating your Synapse class instance like: syn = Synapse(debug=True).

TYPE: bool DEFAULT: False

rows_per_query

The number of rows that will be queried from Synapse per request. Since we need to query for the data that is being updated this will determine the number of rows that are queried at a time. The default is 50,000 rows.

TYPE: int DEFAULT: 50000

update_size_bytes

The maximum size of the request that will be sent to Synapse when updating rows of data. The default is 1.9MB.

TYPE: int DEFAULT: 1.9 * MB

insert_size_bytes

The maximum size of the request that will be sent to Synapse when inserting rows of data. The default is 900MB.

TYPE: int DEFAULT: 900 * MB

job_timeout

The maximum amount of time to wait for a job to complete. This is used when inserting and updating rows of data. Each individual request to Synapse will be sent as an independent job. If the timeout is reached a SynapseTimeoutError will be raised. The default is 600 seconds.

TYPE: int DEFAULT: 600

wait_for_eventually_consistent_view

Only used if the table is a view. If set to True this will wait for the view to reflect any changes that you've made to the view. This is useful if you need to query the view after making changes to the data.

TYPE: bool DEFAULT: False

wait_for_eventually_consistent_view_timeout

The maximum amount of time to wait for a view to be eventually consistent. The default is 600 seconds.

TYPE: int DEFAULT: 600

synapse_client

If not passed in and caching was not disabled by Synapse.allow_client_caching(False) this will use the last created instance from the Synapse class constructor.

TYPE: Optional[Synapse] DEFAULT: None

**kwargs

Additional arguments that are passed to the pd.DataFrame function when the values argument is a path to a csv file.

DEFAULT: {}

Update custom column values in a dataset.

```python
import asyncio
from synapseclient import Synapse
from synapseclient.models import Dataset
import pandas as pd

syn = Synapse()
syn.login()

async def main():
    my_dataset = await Dataset(id="syn1234").get_async()

    # my_annotation must already exist in the dataset as a custom column
    modified_data = pd.DataFrame(
        {"id": ["syn1234"], "my_annotation": ["good data"]}
    )
    await my_dataset.update_rows_async(values=modified_data, primary_keys=["id"], dry_run=False)

asyncio.run(main())
```
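
Since the `values` argument also accepts a dictionary, the same update can be written without building the DataFrame yourself. A sketch under the same assumption that `my_annotation` already exists as a custom column:

```python
import asyncio
from synapseclient import Synapse
from synapseclient.models import Dataset

syn = Synapse()
syn.login()

async def main():
    my_dataset = await Dataset(id="syn1234").get_async()

    # The dictionary is wrapped into a Pandas DataFrame internally.
    await my_dataset.update_rows_async(
        values={"id": ["syn1234"], "my_annotation": ["good data"]},
        primary_keys=["id"],
        dry_run=False,
    )

asyncio.run(main())
```
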
Source code in synapseclient/models/dataset.py
````python
async def update_rows_async(
    self,
    values: DATA_FRAME_TYPE,
    primary_keys: List[str],
    dry_run: bool = False,
    *,
    rows_per_query: int = 50000,
    update_size_bytes: int = 1.9 * MB,
    insert_size_bytes: int = 900 * MB,
    job_timeout: int = 600,
    wait_for_eventually_consistent_view: bool = False,
    wait_for_eventually_consistent_view_timeout: int = 600,
    synapse_client: Optional[Synapse] = None,
    **kwargs,
) -> None:
    """Update the values of rows in the dataset. This method can only
    be used to update values in custom columns. Default columns cannot be updated, but
    may be used as primary keys.

    Limitations:

    - When updating many rows the requests to Synapse will be chunked into smaller
        requests. The limit is 2MB per request. This chunking will happen
        automatically and should not be a concern for most users. If you are
        having issues with the request being too large you may lower the
        number of rows you are trying to update.
    - The `primary_keys` argument must contain at least one column.
    - The `primary_keys` argument cannot contain columns that are a LIST type.
    - The `primary_keys` argument cannot contain columns that are a JSON type.
    - The values used as the `primary_keys` must be unique in the table. If
        multiple rows share the same values in the `primary_keys`, an exception
        will be raised.
    - The columns used in `primary_keys` cannot contain updated values. Since
        the values in these columns are used to determine if a row exists, they
        cannot be updated in the same transaction.

    Arguments:
        values: Supports storing data from the following sources:

            - A string holding the path to a CSV file. The data will be read into a
                [Pandas DataFrame](http://pandas.pydata.org/pandas-docs/stable/api.html#dataframe).
                The code makes assumptions about the format of the columns in the
                CSV as detailed in the [csv_to_pandas_df][synapseclient.models.mixins.table_components.csv_to_pandas_df]
                function. You may pass in additional arguments to the `csv_to_pandas_df`
                function by passing them in as keyword arguments to this function.
            - A dictionary where the key is the column name and the value is one or
                more values. The values will be wrapped into a [Pandas DataFrame](http://pandas.pydata.org/pandas-docs/stable/api.html#dataframe). You may pass in additional arguments to the `pd.DataFrame` function by passing them in as keyword arguments to this function. Read about the available arguments in the [Pandas DataFrame](https://pandas.pydata.org/docs/reference/api/pandas.DataFrame.html) documentation.
            - A [Pandas DataFrame](http://pandas.pydata.org/pandas-docs/stable/api.html#dataframe)

        primary_keys: The columns to use to determine if a row already exists. If
            a row exists with the same values in the columns specified in this list
            the row will be updated. If a row does not exist nothing will be done.

        dry_run: If set to True the data will not be updated in Synapse. A message
            will be printed to the console with the number of rows that would have
            been updated and inserted. If you would like to see the data that would
            be updated and inserted you may set the `dry_run` argument to True and
            set the log level to DEBUG by setting the debug flag when creating
            your Synapse class instance like: `syn = Synapse(debug=True)`.

        rows_per_query: The number of rows that will be queried from Synapse per
            request. Since we need to query for the data that is being updated
            this will determine the number of rows that are queried at a time.
            The default is 50,000 rows.

        update_size_bytes: The maximum size of the request that will be sent to Synapse
            when updating rows of data. The default is 1.9MB.

        insert_size_bytes: The maximum size of the request that will be sent to Synapse
            when inserting rows of data. The default is 900MB.

        job_timeout: The maximum amount of time to wait for a job to complete.
            This is used when inserting and updating rows of data. Each individual
            request to Synapse will be sent as an independent job. If the timeout
            is reached a `SynapseTimeoutError` will be raised.
            The default is 600 seconds.

        wait_for_eventually_consistent_view: Only used if the table is a view. If
            set to True this will wait for the view to reflect any changes that
            you've made to the view. This is useful if you need to query the view
            after making changes to the data.

        wait_for_eventually_consistent_view_timeout: The maximum amount of time to
            wait for a view to be eventually consistent. The default is 600 seconds.

        synapse_client: If not passed in and caching was not disabled by
            `Synapse.allow_client_caching(False)` this will use the last created
            instance from the Synapse class constructor.

        **kwargs: Additional arguments that are passed to the `pd.DataFrame`
            function when the `values` argument is a path to a csv file.


    Example: Update custom column values in a dataset.
        &nbsp;

        ```python
        import asyncio
        from synapseclient import Synapse
        from synapseclient.models import Dataset
        import pandas as pd

        syn = Synapse()
        syn.login()

        async def main():
            my_dataset = await Dataset(id="syn1234").get_async()

            # my_annotation must already exist in the dataset as a custom column
            modified_data = pd.DataFrame(
                {"id": ["syn1234"], "my_annotation": ["good data"]}
            )
            await my_dataset.update_rows_async(values=modified_data, primary_keys=["id"], dry_run=False)

        asyncio.run(main())
        ```
    """
    await super().update_rows_async(
        values=values,
        primary_keys=primary_keys,
        dry_run=dry_run,
        rows_per_query=rows_per_query,
        update_size_bytes=update_size_bytes,
        insert_size_bytes=insert_size_bytes,
        job_timeout=job_timeout,
        wait_for_eventually_consistent_view=wait_for_eventually_consistent_view,
        wait_for_eventually_consistent_view_timeout=wait_for_eventually_consistent_view_timeout,
        synapse_client=synapse_client,
        **kwargs,
    )
````

snapshot_async async

```python
snapshot_async(*, comment: Optional[str] = None, label: Optional[str] = None, include_activity: bool = True, associate_activity_to_new_version: bool = True, synapse_client: Optional[Synapse] = None) -> TableUpdateTransaction
```

Creates a snapshot of the dataset. A snapshot is a saved, read-only version of the dataset at the time it was created. Dataset snapshots are created using the asynchronous job API.

PARAMETER DESCRIPTION
comment

A unique comment to associate with the snapshot.

TYPE: Optional[str] DEFAULT: None

label

A unique label to associate with the snapshot.

TYPE: Optional[str] DEFAULT: None

include_activity

If True the activity will be included in the snapshot if it exists. In order to include the activity, the activity must have already been stored in Synapse by using the activity attribute on the Dataset and calling the store() method on the Dataset instance. Adding an activity to a snapshot of a dataset is meant to capture the provenance of the data at the time of the snapshot. Defaults to True.

TYPE: bool DEFAULT: True

associate_activity_to_new_version

If True the activity will be associated with the new version of the dataset. If False the activity will not be associated with the new version of the dataset. Defaults to True.

TYPE: bool DEFAULT: True

synapse_client

If not passed in and caching was not disabled by Synapse.allow_client_caching(False) this will use the last created instance from the Synapse class constructor.

TYPE: Optional[Synapse] DEFAULT: None

RETURNS DESCRIPTION
TableUpdateTransaction

A TableUpdateTransaction object which includes the version number of the snapshot.

Save a snapshot of a dataset.

```python
import asyncio
from synapseclient import Synapse
from synapseclient.models import Dataset

syn = Synapse()
syn.login()

async def main():
    my_dataset = await Dataset(id="syn1234").get_async()
    await my_dataset.snapshot_async(comment="My first snapshot", label="My first snapshot")

asyncio.run(main())
```
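
Because `snapshot_async` returns a `TableUpdateTransaction` that includes the version number of the snapshot, you may want to keep a reference to it. A sketch that relies only on the documented return value; no attribute names beyond that are assumed:

```python
import asyncio
from synapseclient import Synapse
from synapseclient.models import Dataset

syn = Synapse()
syn.login()

async def main():
    my_dataset = await Dataset(id="syn1234").get_async()
    # The returned transaction includes the version number of the snapshot.
    transaction = await my_dataset.snapshot_async(
        comment="My first snapshot", label="My first snapshot"
    )
    print(transaction)

asyncio.run(main())
```
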
Source code in synapseclient/models/dataset.py
````python
async def snapshot_async(
    self,
    *,
    comment: Optional[str] = None,
    label: Optional[str] = None,
    include_activity: bool = True,
    associate_activity_to_new_version: bool = True,
    synapse_client: Optional[Synapse] = None,
) -> "TableUpdateTransaction":
    """Creates a snapshot of the dataset. A snapshot is a saved, read-only version of the dataset
    at the time it was created. Dataset snapshots are created using the asynchronous job API.

    Arguments:
        comment: A unique comment to associate with the snapshot.
        label: A unique label to associate with the snapshot.
        include_activity: If True the activity will be included in the snapshot if it
            exists. In order to include the activity, the activity must have already
            been stored in Synapse by using the `activity` attribute on the Dataset
            and calling the `store()` method on the Dataset instance. Adding an
            activity to a snapshot of a dataset is meant to capture the provenance of
            the data at the time of the snapshot. Defaults to True.
        associate_activity_to_new_version: If True the activity will be associated
            with the new version of the dataset. If False the activity will not be
            associated with the new version of the dataset. Defaults to True.
        synapse_client: If not passed in and caching was not disabled by
            `Synapse.allow_client_caching(False)` this will use the last created
            instance from the Synapse class constructor.

    Returns:
        A `TableUpdateTransaction` object which includes the version number of the snapshot.

    Example: Save a snapshot of a dataset.
        &nbsp;

        ```python
        import asyncio
        from synapseclient import Synapse
        from synapseclient.models import Dataset

        syn = Synapse()
        syn.login()

        async def main():
            my_dataset = await Dataset(id="syn1234").get_async()
            await my_dataset.snapshot_async(comment="My first snapshot", label="My first snapshot")

        asyncio.run(main())
        ```
    """
    return await super().snapshot_async(
        comment=comment,
        label=label,
        include_activity=include_activity,
        associate_activity_to_new_version=associate_activity_to_new_version,
        synapse_client=synapse_client,
    )
````

query_async async staticmethod

```python
query_async(query: str, include_row_id_and_row_version: bool = True, convert_to_datetime: bool = False, download_location=None, quote_character='"', escape_character='\\', line_end=str(linesep), separator=',', header=True, *, synapse_client: Optional[Synapse] = None, **kwargs) -> Union[DATA_FRAME_TYPE, str]
```

Query for data on a table stored in Synapse. The results will always be returned as a Pandas DataFrame unless you specify a download_location, in which case the results will be downloaded to that location. There are a number of arguments that you may pass to this function depending on whether you are getting the results back as a DataFrame or downloading the results to a file.

PARAMETER DESCRIPTION
query

The query to run. The query must be valid syntax that Synapse can understand. See this document that describes the expected syntax of the query: https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/web/controller/TableExamples.html

TYPE: str

include_row_id_and_row_version

If True the ROW_ID and ROW_VERSION columns will be returned in the DataFrame. These columns are required if using the query results to update rows in the table. These columns are the primary keys used by Synapse to uniquely identify rows in the table.

TYPE: bool DEFAULT: True

convert_to_datetime

(DataFrame only) If set to True, will convert all Synapse DATE columns from UNIX timestamp integers into UTC datetime objects

TYPE: bool DEFAULT: False

download_location

(CSV Only) If set to a path the results will be downloaded to that directory. The results will be downloaded as a CSV file. A path to the downloaded file will be returned instead of a DataFrame.

DEFAULT: None

quote_character

(CSV Only) The character to use to quote fields. The default is a double quote.

DEFAULT: '"'

escape_character

(CSV Only) The character to use to escape special characters. The default is a backslash.

DEFAULT: '\\'

line_end

(CSV Only) The character to use to end a line. The default is the system's line separator.

DEFAULT: str(linesep)

separator

(CSV Only) The character to use to separate fields. The default is a comma.

DEFAULT: ','

header

(CSV Only) If set to True the first row will be used as the header row. The default is True.

DEFAULT: True

**kwargs

(DataFrame only) Additional keyword arguments to pass to pandas.read_csv. See https://pandas.pydata.org/docs/reference/api/pandas.read_csv.html for a complete list of supported arguments. This is exposed because, internally, the query downloads a CSV from Synapse and then loads it into a DataFrame.

DEFAULT: {}

synapse_client

If not passed in and caching was not disabled by Synapse.allow_client_caching(False) this will use the last created instance from the Synapse class constructor.

TYPE: Optional[Synapse] DEFAULT: None

RETURNS DESCRIPTION
Union[DATA_FRAME_TYPE, str]

The results of the query as a Pandas DataFrame, or a path to the downloaded query results if download_location is set.

Querying for data

This example shows how you may query for data in a table and print out the results.

```python
import asyncio
from synapseclient import Synapse
from synapseclient.models import query_async

syn = Synapse()
syn.login()

async def main():
    results = await query_async(query="SELECT * FROM syn1234")
    print(results)

asyncio.run(main())
```
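
When `download_location` is set, the same call returns the path to a downloaded CSV instead of a DataFrame. A minimal sketch; the directory path is an illustrative assumption:

```python
import asyncio
from synapseclient import Synapse
from synapseclient.models import query_async

syn = Synapse()
syn.login()

async def main():
    # With download_location set, the results are written to a CSV file and
    # the path to that file is returned instead of a Pandas DataFrame.
    csv_path = await query_async(
        query="SELECT * FROM syn1234",
        download_location="/tmp",
    )
    print(csv_path)

asyncio.run(main())
```
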
Source code in synapseclient/models/mixins/table_components.py
````python
@staticmethod
async def query_async(
    query: str,
    include_row_id_and_row_version: bool = True,
    convert_to_datetime: bool = False,
    download_location=None,
    quote_character='"',
    escape_character="\\",
    line_end=str(os.linesep),
    separator=",",
    header=True,
    *,
    synapse_client: Optional[Synapse] = None,
    **kwargs,
) -> Union["DATA_FRAME_TYPE", str]:
    """Query for data on a table stored in Synapse. The results will always be
    returned as a Pandas DataFrame unless you specify a `download_location` in which
    case the results will be downloaded to that location. There are a number of
    arguments that you may pass to this function depending on whether you are getting
    the results back as a DataFrame or downloading the results to a file.

    Arguments:
        query: The query to run. The query must be valid syntax that Synapse can
            understand. See this document that describes the expected syntax of the
            query:
            <https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/web/controller/TableExamples.html>
        include_row_id_and_row_version: If True the `ROW_ID` and `ROW_VERSION`
            columns will be returned in the DataFrame. These columns are required
            if using the query results to update rows in the table. These columns
            are the primary keys used by Synapse to uniquely identify rows in the
            table.
        convert_to_datetime: (DataFrame only) If set to True, will convert all
            Synapse DATE columns from UNIX timestamp integers into UTC datetime
            objects

        download_location: (CSV Only) If set to a path the results will be
            downloaded to that directory. The results will be downloaded as a CSV
            file. A path to the downloaded file will be returned instead of a
            DataFrame.

        quote_character: (CSV Only) The character to use to quote fields. The
            default is a double quote.

        escape_character: (CSV Only) The character to use to escape special
            characters. The default is a backslash.

        line_end: (CSV Only) The character to use to end a line. The default is
            the system's line separator.

        separator: (CSV Only) The character to use to separate fields. The default
            is a comma.

        header: (CSV Only) If set to True the first row will be used as the header
            row. The default is True.

        **kwargs: (DataFrame only) Additional keyword arguments to pass to
            pandas.read_csv. See
            <https://pandas.pydata.org/docs/reference/api/pandas.read_csv.html>
            for a complete list of supported arguments. This is exposed because,
            internally, the query downloads a CSV from Synapse and then loads
            it into a DataFrame.
        synapse_client: If not passed in and caching was not disabled by
            `Synapse.allow_client_caching(False)` this will use the last created
            instance from the Synapse class constructor.

    Returns:
        The results of the query as a Pandas DataFrame or a path to the downloaded
        query results if `download_location` is set.

    Example: Querying for data
        This example shows how you may query for data in a table and print out the
        results.

        ```python
        import asyncio
        from synapseclient import Synapse
        from synapseclient.models import query_async

        syn = Synapse()
        syn.login()

        async def main():
            results = await query_async(query="SELECT * FROM syn1234")
            print(results)

        asyncio.run(main())
        ```
    """

    client = Synapse.get_client(synapse_client=synapse_client)

    if client.logger.isEnabledFor(logging.DEBUG):
        client.logger.debug(f"Running query: {query}")

    # TODO: Implementation should not download the CSV to disk; instead the ideal
    # solution will load the result into BytesIO and then pass that to
    # pandas.read_csv. During implementation, a determination of how large a CSV
    # can be loaded from memory will be needed. When that limit is reached we
    # should continue to force the download of those results to disk.
    result, csv_path = await _table_query(
        query=query,
        include_row_id_and_row_version=include_row_id_and_row_version,
        quote_char=quote_character,
        escape_char=escape_character,
        line_end=line_end,
        separator=separator,
        header=header,
        download_location=download_location,
    )

    if download_location:
        return csv_path

    date_columns = []
    list_columns = []
    dtype = {}

    if result.headers is not None:
        for column in result.headers:
            if column.column_type == "STRING":
                # we want to identify string columns so that pandas doesn't try to
                # automatically parse strings in a string column to other data types
                dtype[column.name] = str
            elif column.column_type in LIST_COLUMN_TYPES:
                list_columns.append(column.name)
            elif column.column_type == "DATE" and convert_to_datetime:
                date_columns.append(column.name)

    return csv_to_pandas_df(
        filepath=csv_path,
        separator=separator or DEFAULT_SEPARATOR,
        quote_char=quote_character or DEFAULT_QUOTE_CHARACTER,
        escape_char=escape_character or DEFAULT_ESCAPSE_CHAR,
        row_id_and_version_in_index=False,
        date_columns=date_columns if date_columns else None,
        list_columns=list_columns if list_columns else None,
        **kwargs,
    )
````

query_part_mask_async async staticmethod

```python
query_part_mask_async(query: str, part_mask: int, *, synapse_client: Optional[Synapse] = None, **kwargs) -> QueryResultOutput
```

Query for data on a table stored in Synapse. This is a more advanced use case of the query function that allows you to determine what additional metadata about the table or query should also be returned. If you do not need this additional information then you are better off using the query function.

The query for this method uses this REST API: https://rest-docs.synapse.org/rest/POST/entity/id/table/query/async/start.html

PARAMETER DESCRIPTION
query

The query to run. The query must be valid syntax that Synapse can understand. See this document that describes the expected syntax of the query: https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/web/controller/TableExamples.html

TYPE: str

part_mask

The bitwise OR of the part mask values you want to return in the results. The following part masks are implemented to be returned in the results:

  • Query Results (queryResults) = 0x1
  • Query Count (queryCount) = 0x2
  • The sum of the file sizes (sumFileSizesBytes) = 0x40
  • The last updated on date of the table (lastUpdatedOn) = 0x80

TYPE: int

synapse_client

If not passed in and caching was not disabled by Synapse.allow_client_caching(False) this will use the last created instance from the Synapse class constructor.

TYPE: Optional[Synapse] DEFAULT: None

RETURNS DESCRIPTION
QueryResultOutput

The results of the query as a QueryResultOutput object.

Querying for data with a part mask

This example shows how to use Python's bitwise OR to combine the part mask values and then use that to query for data in a table and print out the results.

In this case we are getting the results of the query, the count of rows, and the last updated on date of the table.

```python
import asyncio
from synapseclient import Synapse
from synapseclient.models import query_part_mask_async

syn = Synapse()
syn.login()

QUERY_RESULTS = 0x1
QUERY_COUNT = 0x2
LAST_UPDATED_ON = 0x80

# Combine the part mask values using bitwise OR
part_mask = QUERY_RESULTS | QUERY_COUNT | LAST_UPDATED_ON


async def main():
    result = await query_part_mask_async(query="SELECT * FROM syn1234", part_mask=part_mask)
    print(result)

asyncio.run(main())
```
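
The sum of the file sizes (0x40) can be OR'd into the mask the same way; a sketch using only the part-mask values listed above:

```python
import asyncio
from synapseclient import Synapse
from synapseclient.models import query_part_mask_async

syn = Synapse()
syn.login()

QUERY_RESULTS = 0x1
SUM_FILE_SIZES = 0x40

# Request the query results along with the summed file sizes.
part_mask = QUERY_RESULTS | SUM_FILE_SIZES

async def main():
    result = await query_part_mask_async(
        query="SELECT * FROM syn1234", part_mask=part_mask
    )
    print(result)

asyncio.run(main())
```
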
Source code in synapseclient/models/mixins/table_components.py
````python
@staticmethod
async def query_part_mask_async(
    query: str,
    part_mask: int,
    *,
    synapse_client: Optional[Synapse] = None,
    **kwargs,
) -> "QueryResultOutput":
    """Query for data on a table stored in Synapse. This is a more advanced use case
    of the `query` function that allows you to determine what additional metadata
    about the table or query should also be returned. If you do not need this
    additional information then you are better off using the `query` function.

    The query for this method uses this REST API:
    <https://rest-docs.synapse.org/rest/POST/entity/id/table/query/async/start.html>

    Arguments:
        query: The query to run. The query must be valid syntax that Synapse can
            understand. See this document that describes the expected syntax of the
            query:
            <https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/web/controller/TableExamples.html>
        part_mask: The bitwise OR of the part mask values you want to return in the
            results. The following part masks are implemented to be returned
            in the results:

            - Query Results (queryResults) = 0x1
            - Query Count (queryCount) = 0x2
            - The sum of the file sizes (sumFileSizesBytes) = 0x40
            - The last updated on date of the table (lastUpdatedOn) = 0x80

        synapse_client: If not passed in and caching was not disabled by
            `Synapse.allow_client_caching(False)` this will use the last created
            instance from the Synapse class constructor.

    Returns:
        The results of the query as a QueryResultOutput object.

    Example: Querying for data with a part mask
        This example shows how to use Python's bitwise `OR` to combine the
        part mask values and then use that to query for data in a table and print
        out the results.

        In this case we are getting the results of the query, the count of rows, and
        the last updated on date of the table.

        ```python
        import asyncio
        from synapseclient import Synapse
        from synapseclient.models import query_part_mask_async

        syn = Synapse()
        syn.login()

        QUERY_RESULTS = 0x1
        QUERY_COUNT = 0x2
        LAST_UPDATED_ON = 0x80

        # Combine the part mask values using bitwise OR
        part_mask = QUERY_RESULTS | QUERY_COUNT | LAST_UPDATED_ON


        async def main():
            result = await query_part_mask_async(query="SELECT * FROM syn1234", part_mask=part_mask)
            print(result)

        asyncio.run(main())
        ```
    """
    loop = asyncio.get_event_loop()

    client = Synapse.get_client(synapse_client=synapse_client)
    client.logger.info(f"Running query: {query}")
    limit = kwargs.get("limit", None)
    offset = kwargs.get("offset", None)

    results = await _table_query(
        query=query,
        results_as="rowset",
        part_mask=part_mask,
        limit=limit,
        offset=offset,
    )

    as_df = await loop.run_in_executor(
        None,
        lambda: _rowset_to_pandas_df(
            query_result_bundle=results,
            synapse=client,
            row_id_and_version_in_index=False,
        ),
    )
    return QueryResultOutput.fill_from_dict(
        result=as_df,
        data={
            "count": results.query_count,
            "last_updated_on": results.last_updated_on,
            "sum_file_sizes": results.sum_file_sizes,
        },
    )
````

add_column

```python
add_column(column: Union[Column, List[Column]], index: int = None) -> None
```

Add column(s) to the table. Note that this does not store the column(s) in Synapse. You must call the .store() function on this table class instance to store the column(s) in Synapse. This is a convenience function to eliminate the need to manually add the column(s) to the dictionary.

This function will add an item to the .columns attribute of this class instance. .columns is a dictionary where the key is the name of the column and the value is the Column object.

PARAMETER DESCRIPTION
column

The column(s) to add, may be a single Column object or a list of Column objects.

TYPE: Union[Column, List[Column]]

index

The index to insert the column at. If not passed in the column will be added to the end of the list.

TYPE: int DEFAULT: None

RETURNS DESCRIPTION
None

None

Adding a single column

This example shows how you may add a single column to a table and then store the change back in Synapse.

```python
from synapseclient import Synapse
from synapseclient.models import Column, ColumnType, Table

syn = Synapse()
syn.login()

table = Table(
    id="syn1234"
).get(include_columns=True)

table.add_column(
    Column(name="my_column", column_type=ColumnType.STRING)
)
table.store()
```

Adding multiple columns

This example shows how you may add multiple columns to a table and then store the change back in Synapse.

```python
from synapseclient import Synapse
from synapseclient.models import Column, ColumnType, Table

syn = Synapse()
syn.login()

table = Table(
    id="syn1234"
).get(include_columns=True)

table.add_column([
    Column(name="my_column", column_type=ColumnType.STRING),
    Column(name="my_column2", column_type=ColumnType.INTEGER),
])
table.store()
```

Adding a column at a specific index

This example shows how you may add a column at a specific index to a table and then store the change back in Synapse. If the index is out of bounds the column will be added to the end of the list.

```python
from synapseclient import Synapse
from synapseclient.models import Column, ColumnType, Table

syn = Synapse()
syn.login()

table = Table(
    id="syn1234"
).get(include_columns=True)

table.add_column(
    Column(name="my_column", column_type=ColumnType.STRING),
    # Add the column at the beginning of the list
    index=0
)
table.store()
```

Adding a single column (async)

This example shows how you may add a single column to a table and then store the change back in Synapse.

```python
import asyncio
from synapseclient import Synapse
from synapseclient.models import Column, ColumnType, Table

syn = Synapse()
syn.login()

async def main():
    table = await Table(
        id="syn1234"
    ).get_async(include_columns=True)

    table.add_column(
        Column(name="my_column", column_type=ColumnType.STRING)
    )
    await table.store_async()

asyncio.run(main())
```

Adding multiple columns (async)

This example shows how you may add multiple columns to a table and then store the change back in Synapse.

import asyncio
from synapseclient import Synapse
from synapseclient.models import Column, ColumnType, Table

syn = Synapse()
syn.login()

async def main():
    table = await Table(
        id="syn1234"
    ).get_async(include_columns=True)

    table.add_column([
        Column(name="my_column", column_type=ColumnType.STRING),
        Column(name="my_column2", column_type=ColumnType.INTEGER),
    ])
    await table.store_async()

asyncio.run(main())
Adding a column at a specific index (async)

This example shows how you may add a column at a specific index to a table and then store the change back in Synapse. If the index is out of bounds, the column will be added to the end of the list.

import asyncio
from synapseclient import Synapse
from synapseclient.models import Column, ColumnType, Table

syn = Synapse()
syn.login()

async def main():
    table = await Table(
        id="syn1234"
    ).get_async(include_columns=True)

    table.add_column(
        Column(name="my_column", column_type=ColumnType.STRING),
        # Add the column at the beginning of the list
        index=0
    )
    await table.store_async()

asyncio.run(main())
Source code in synapseclient/models/mixins/table_components.py
def add_column(
    self, column: Union["Column", List["Column"]], index: int = None
) -> None:
    """Add column(s) to the table. Note that this does not store the column(s) in
    Synapse. You must call the `.store()` function on this table class instance to
    store the column(s) in Synapse. This is a convenience function to eliminate
    the need to manually add the column(s) to the dictionary.


    This function will add an item to the `.columns` attribute of this class
    instance. `.columns` is a dictionary where the key is the name of the column
    and the value is the Column object.

    Arguments:
        column: The column(s) to add, may be a single Column object or a list of
            Column objects.
        index: The index to insert the column at. If not passed in the column will
            be added to the end of the list.

    Returns:
        None

    Example: Adding a single column
        This example shows how you may add a single column to a table and then store
        the change back in Synapse.

        ```python
        from synapseclient import Synapse
        from synapseclient.models import Column, ColumnType, Table

        syn = Synapse()
        syn.login()

        table = Table(
            id="syn1234"
        ).get(include_columns=True)

        table.add_column(
            Column(name="my_column", column_type=ColumnType.STRING)
        )
        table.store()
        ```


    Example: Adding multiple columns
        This example shows how you may add multiple columns to a table and then store
        the change back in Synapse.

        ```python
        from synapseclient import Synapse
        from synapseclient.models import Column, ColumnType, Table

        syn = Synapse()
        syn.login()

        table = Table(
            id="syn1234"
        ).get(include_columns=True)

        table.add_column([
            Column(name="my_column", column_type=ColumnType.STRING),
            Column(name="my_column2", column_type=ColumnType.INTEGER),
        ])
        table.store()
        ```

    Example: Adding a column at a specific index
        This example shows how you may add a column at a specific index to a table
        and then store the change back in Synapse. If the index is out of bounds the
        column will be added to the end of the list.

        ```python
        from synapseclient import Synapse
        from synapseclient.models import Column, ColumnType, Table

        syn = Synapse()
        syn.login()

        table = Table(
            id="syn1234"
        ).get(include_columns=True)

        table.add_column(
            Column(name="my_column", column_type=ColumnType.STRING),
            # Add the column at the beginning of the list
            index=0
        )
        table.store()
        ```

    Example: Adding a single column (async)
        This example shows how you may add a single column to a table and then store
        the change back in Synapse.

        ```python
        import asyncio
        from synapseclient import Synapse
        from synapseclient.models import Column, ColumnType, Table

        syn = Synapse()
        syn.login()

        async def main():
            table = await Table(
                id="syn1234"
            ).get_async(include_columns=True)

            table.add_column(
                Column(name="my_column", column_type=ColumnType.STRING)
            )
            await table.store_async()

        asyncio.run(main())
        ```

    Example: Adding multiple columns (async)
        This example shows how you may add multiple columns to a table and then store
        the change back in Synapse.

        ```python
        import asyncio
        from synapseclient import Synapse
        from synapseclient.models import Column, ColumnType, Table

        syn = Synapse()
        syn.login()

        async def main():
            table = await Table(
                id="syn1234"
            ).get_async(include_columns=True)

            table.add_column([
                Column(name="my_column", column_type=ColumnType.STRING),
                Column(name="my_column2", column_type=ColumnType.INTEGER),
            ])
            await table.store_async()

        asyncio.run(main())
        ```

    Example: Adding a column at a specific index (async)
        This example shows how you may add a column at a specific index to a table
        and then store the change back in Synapse. If the index is out of bounds the
        column will be added to the end of the list.

        ```python
        import asyncio
        from synapseclient import Synapse
        from synapseclient.models import Column, ColumnType, Table

        syn = Synapse()
        syn.login()

        async def main():
            table = await Table(
                id="syn1234"
            ).get_async(include_columns=True)

            table.add_column(
                Column(name="my_column", column_type=ColumnType.STRING),
                # Add the column at the beginning of the list
                index=0
            )
            await table.store_async()

        asyncio.run(main())
        ```
    """
    if not self._last_persistent_instance:
        raise ValueError(
            "This method is only supported after interacting with Synapse via a `.get()` or `.store()` operation"
        )

    if index is not None:
        if isinstance(column, list):
            columns_to_insert = []
            for i, col in enumerate(column):
                if col.name in self.columns:
                    raise ValueError(f"Duplicate column name: {col.name}")
                columns_to_insert.append((col.name, col))
            insert_index = min(index, len(self.columns))
            self.columns = OrderedDict(
                list(self.columns.items())[:insert_index]
                + columns_to_insert
                + list(self.columns.items())[insert_index:]
            )
        else:
            if column.name in self.columns:
                raise ValueError(f"Duplicate column name: {column.name}")
            insert_index = min(index, len(self.columns))
            self.columns = OrderedDict(
                list(self.columns.items())[:insert_index]
                + [(column.name, column)]
                + list(self.columns.items())[insert_index:]
            )

    else:
        if isinstance(column, list):
            for col in column:
                if col.name in self.columns:
                    raise ValueError(f"Duplicate column name: {col.name}")
                self.columns[col.name] = col
        else:
            if column.name in self.columns:
                raise ValueError(f"Duplicate column name: {column.name}")
            self.columns[column.name] = column

delete_column

delete_column(name: str) -> None

Mark a column for deletion. Note that this does not delete the column from Synapse. You must call the .store() function on this table class instance to delete the column from Synapse. This is a convenience function to eliminate the need to manually delete the column from the dictionary and add it to the ._columns_to_delete attribute.

PARAMETER DESCRIPTION
name

The name of the column to delete.

TYPE: str

RETURNS DESCRIPTION
None

None

Deleting a column

This example shows how you may delete a column from a table and then store the change back in Synapse.

from synapseclient import Synapse
from synapseclient.models import Table

syn = Synapse()
syn.login()

table = Table(
    id="syn1234"
).get(include_columns=True)

table.delete_column(name="my_column")
table.store()
Deleting a column (async)

This example shows how you may delete a column from a table and then store the change back in Synapse.

import asyncio
from synapseclient import Synapse
from synapseclient.models import Table

syn = Synapse()
syn.login()

async def main():
    table = await Table(
        id="syn1234"
    ).get_async(include_columns=True)

    table.delete_column(name="my_column")
    await table.store_async()

asyncio.run(main())
Source code in synapseclient/models/mixins/table_components.py
def delete_column(self, name: str) -> None:
    """
    Mark a column for deletion. Note that this does not delete the column from
    Synapse. You must call the `.store()` function on this table class instance to
    delete the column from Synapse. This is a convenience function to eliminate
    the need to manually delete the column from the dictionary and add it to the
    `._columns_to_delete` attribute.

    Arguments:
        name: The name of the column to delete.

    Returns:
        None

    Example: Deleting a column
        This example shows how you may delete a column from a table and then store
        the change back in Synapse.

        ```python
        from synapseclient import Synapse
        from synapseclient.models import Table

        syn = Synapse()
        syn.login()

        table = Table(
            id="syn1234"
        ).get(include_columns=True)

        table.delete_column(name="my_column")
        table.store()
        ```

    Example: Deleting a column (async)
        This example shows how you may delete a column from a table and then store
        the change back in Synapse.

        ```python
        import asyncio
        from synapseclient import Synapse
        from synapseclient.models import Table

        syn = Synapse()
        syn.login()

        async def main():
            table = await Table(
                id="syn1234"
            ).get_async(include_columns=True)

            table.delete_column(name="my_column")
            await table.store_async()

        asyncio.run(main())
        ```
    """
    if not self._last_persistent_instance:
        raise ValueError(
            "This method is only supported after interacting with Synapse via a `.get()` or `.store()` operation"
        )
    if not self.columns:
        raise ValueError(
            "There are no columns. Make sure you use the `include_columns` parameter in the `.get()` method."
        )

    column_to_delete = self.columns.get(name, None)
    if not column_to_delete:
        raise ValueError(f"Column with name {name} does not exist in the table.")

    self._columns_to_delete[column_to_delete.id] = column_to_delete
    self.columns.pop(column_to_delete.name, None)

reorder_column

reorder_column(name: str, index: int) -> None

Reorder a column in the table. Note that this does not store the change in Synapse. You must call the .store() function on this table class instance to persist the new column order in Synapse. This is a convenience function to eliminate the need to manually reorder the .columns attribute dictionary.

If you pass in an index that is out of bounds, the column will be moved to the end of the list.

PARAMETER DESCRIPTION
name

The name of the column to reorder.

TYPE: str

index

The index to move the column to starting with 0.

TYPE: int

RETURNS DESCRIPTION
None

None

Reordering a column

This example shows how you may reorder a column in a table and then store the change back in Synapse.

from synapseclient import Synapse
from synapseclient.models import Column, ColumnType, Table

syn = Synapse()
syn.login()

table = Table(
    id="syn1234"
).get(include_columns=True)

# Move the column to the beginning of the list
table.reorder_column(name="my_column", index=0)
table.store()
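Moving a column to the end of the list

As noted above, an out-of-bounds index moves the column to the end of the list. This is a minimal sketch assuming the table contains a column named "my_column"; the table ID is a placeholder.

from synapseclient import Synapse
from synapseclient.models import Table

syn = Synapse()
syn.login()

table = Table(
    id="syn1234"
).get(include_columns=True)

# Any index >= the number of remaining columns moves the column to the end
table.reorder_column(name="my_column", index=len(table.columns))
table.store()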
Reordering a column (async)

This example shows how you may reorder a column in a table and then store the change back in Synapse.

import asyncio
from synapseclient import Synapse
from synapseclient.models import Column, ColumnType, Table

syn = Synapse()
syn.login()

async def main():
    table = await Table(
        id="syn1234"
    ).get_async(include_columns=True)

    # Move the column to the beginning of the list
    table.reorder_column(name="my_column", index=0)
    await table.store_async()

asyncio.run(main())
Source code in synapseclient/models/mixins/table_components.py
def reorder_column(self, name: str, index: int) -> None:
    """Reorder a column in the table. Note that this does not store the column in
    Synapse. You must call the `.store()` function on this table class instance to
    store the column in Synapse. This is a convenience function to eliminate
    the need to manually reorder the `.columns` attribute dictionary.

    You must ensure that the index is within the bounds of the number of columns in
    the table. If you pass in an index that is out of bounds the column will be
    added to the end of the list.

    Arguments:
        name: The name of the column to reorder.
        index: The index to move the column to starting with 0.

    Returns:
        None

    Example: Reordering a column
        This example shows how you may reorder a column in a table and then store
        the change back in Synapse.

        ```python
        from synapseclient import Synapse
        from synapseclient.models import Column, ColumnType, Table

        syn = Synapse()
        syn.login()

        table = Table(
            id="syn1234"
        ).get(include_columns=True)

        # Move the column to the beginning of the list
        table.reorder_column(name="my_column", index=0)
        table.store()
        ```


    Example: Reordering a column (async)
        This example shows how you may reorder a column in a table and then store
        the change back in Synapse.

        ```python
        import asyncio
        from synapseclient import Synapse
        from synapseclient.models import Column, ColumnType, Table

        syn = Synapse()
        syn.login()

        async def main():
            table = await Table(
                id="syn1234"
            ).get_async(include_columns=True)

            # Move the column to the beginning of the list
            table.reorder_column(name="my_column", index=0)
            await table.store_async()

        asyncio.run(main())
        ```
    """
    if not self._last_persistent_instance:
        raise ValueError(
            "This method is only supported after interacting with Synapse via a `.get()` or `.store()` operation"
        )

    column_to_reorder = self.columns.pop(name, None)
    if column_to_reorder is None:
        raise ValueError(f"Column with name {name} does not exist in the table.")
    if index >= len(self.columns):
        self.columns[name] = column_to_reorder
        return

    self.columns = OrderedDict(
        list(self.columns.items())[:index]
        + [(name, column_to_reorder)]
        + list(self.columns.items())[index:]
    )

get_permissions_async async

get_permissions_async(*, synapse_client: Optional[Synapse] = None) -> Permissions

Get the permissions that the caller has on an Entity.

PARAMETER DESCRIPTION
synapse_client

If not passed in and caching was not disabled by Synapse.allow_client_caching(False) this will use the last created instance from the Synapse class constructor.

TYPE: Optional[Synapse] DEFAULT: None

RETURNS DESCRIPTION
Permissions

A Permissions object

Using this function:

Getting permissions for a Synapse Entity

import asyncio
from synapseclient import Synapse
from synapseclient.models import File

syn = Synapse()
syn.login()

async def main():
    permissions = await File(id="syn123").get_permissions_async()

asyncio.run(main())

Getting access types list from the Permissions object

permissions.access_types
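Checking for a specific access type

A minimal sketch of inspecting the returned access types; the entity ID is a placeholder, and 'DOWNLOAD' is one of the permission strings Synapse may return.

import asyncio
from synapseclient import Synapse
from synapseclient.models import File

syn = Synapse()
syn.login()

async def main():
    permissions = await File(id="syn123").get_permissions_async()
    # access_types is a list of permission strings, e.g. ['READ', 'DOWNLOAD']
    if "DOWNLOAD" in permissions.access_types:
        print("The caller can download this file.")

asyncio.run(main())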
Source code in synapseclient/models/mixins/access_control.py
async def get_permissions_async(
    self,
    *,
    synapse_client: Optional[Synapse] = None,
) -> "Permissions":
    """
    Get the [permissions][synapseclient.core.models.permission.Permissions]
    that the caller has on an Entity.

    Arguments:
        synapse_client: If not passed in and caching was not disabled by
            `Synapse.allow_client_caching(False)` this will use the last created
            instance from the Synapse class constructor.

    Returns:
        A Permissions object


    Example: Using this function:
        Getting permissions for a Synapse Entity

        ```python
        import asyncio
        from synapseclient import Synapse
        from synapseclient.models import File

        syn = Synapse()
        syn.login()

        async def main():
            permissions = await File(id="syn123").get_permissions_async()

        asyncio.run(main())
        ```

        Getting access types list from the Permissions object

        ```
        permissions.access_types
        ```
    """
    from synapseclient.core.models.permission import Permissions

    permissions_dict = await get_entity_permissions(
        entity_id=self.id,
        synapse_client=synapse_client,
    )
    return Permissions.from_dict(data=permissions_dict)

get_acl_async async

get_acl_async(principal_id: int = None, check_benefactor: bool = True, *, synapse_client: Optional[Synapse] = None) -> List[str]

Get the ACL that a user or group has on an Entity.

Note: If the entity does not have local sharing settings, or an ACL set directly on it, this will look up the ACL on the benefactor of the entity. The benefactor is the entity that the current entity inherits its permissions from. The benefactor is usually the parent entity, but it can be any ancestor in the hierarchy. For example, a newly created Project will be its own benefactor, while a new FileEntity's benefactor will start off as its containing Project or Folder. If the entity already has local sharing settings, the benefactor would be itself.

PARAMETER DESCRIPTION
principal_id

Identifier of a user or group (defaults to PUBLIC users)

TYPE: int DEFAULT: None

check_benefactor

If True (default), check the benefactor for the entity to get the ACL. If False, only check the entity itself. This is useful for checking the ACL of an entity that has local sharing settings, but you want to check the ACL of the entity itself and not the benefactor it may inherit from.

TYPE: bool DEFAULT: True

synapse_client

If not passed in and caching was not disabled by Synapse.allow_client_caching(False) this will use the last created instance from the Synapse class constructor.

TYPE: Optional[Synapse] DEFAULT: None

RETURNS DESCRIPTION
List[str]

An array containing some combination of ['READ', 'UPDATE', 'CREATE', 'DELETE', 'DOWNLOAD', 'MODERATE', 'CHANGE_PERMISSIONS', 'CHANGE_SETTINGS'] or an empty array
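
Getting the ACL for a user or group (async)

A minimal sketch of retrieving the ACL for a principal; syn123 and 273948 are placeholder IDs for the entity and the user or group of interest. Passing check_benefactor=False restricts the lookup to an ACL set directly on the entity itself.

import asyncio
from synapseclient import Synapse
from synapseclient.models import File

syn = Synapse()
syn.login()

async def main():
    # ACL resolved through the benefactor (default behavior)
    access_types = await File(id="syn123").get_acl_async(principal_id=273948)
    print(access_types)

    # Only consider local sharing settings on the entity itself
    local_access_types = await File(id="syn123").get_acl_async(
        principal_id=273948,
        check_benefactor=False,
    )
    print(local_access_types)

asyncio.run(main())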

Source code in synapseclient/models/mixins/access_control.py
async def get_acl_async(
    self,
    principal_id: int = None,
    check_benefactor: bool = True,
    *,
    synapse_client: Optional[Synapse] = None,
) -> List[str]:
    """
    Get the [ACL][synapseclient.core.models.permission.Permissions.access_types]
    that a user or group has on an Entity.

    Note: If the entity does not have local sharing settings, or ACL set directly
    on it, this will look up the ACL on the benefactor of the entity. The
    benefactor is the entity that the current entity inherits its permissions from.
    The benefactor is usually the parent entity, but it can be any ancestor in the
    hierarchy. For example, a newly created Project will be its own benefactor,
    while a new FileEntity's benefactor will start off as its containing Project or
    Folder. If the entity already has local sharing settings, the benefactor would
    be itself.

    Arguments:
        principal_id: Identifier of a user or group (defaults to PUBLIC users)
        check_benefactor: If True (default), check the benefactor for the entity
            to get the ACL. If False, only check the entity itself.
            This is useful for checking the ACL of an entity that has local sharing
            settings, but you want to check the ACL of the entity itself and not
            the benefactor it may inherit from.
        synapse_client: If not passed in and caching was not disabled by
            `Synapse.allow_client_caching(False)` this will use the last created
            instance from the Synapse class constructor.

    Returns:
        An array containing some combination of
            ['READ', 'UPDATE', 'CREATE', 'DELETE', 'DOWNLOAD', 'MODERATE',
            'CHANGE_PERMISSIONS', 'CHANGE_SETTINGS']
            or an empty array
    """
    return await get_entity_acl_list(
        entity_id=self.id,
        principal_id=str(principal_id) if principal_id is not None else None,
        check_benefactor=check_benefactor,
        synapse_client=synapse_client,
    )

set_permissions_async async

set_permissions_async(principal_id: int = None, access_type: List[str] = None, modify_benefactor: bool = False, warn_if_inherits: bool = True, overwrite: bool = True, *, synapse_client: Optional[Synapse] = None) -> Dict[str, Union[str, list]]

Sets the permissions that a user or group has on an Entity. An Entity may have its own ACL or inherit its ACL from a benefactor.

PARAMETER DESCRIPTION
principal_id

Identifier of a user or group. 273948 is for all registered Synapse users and 273949 is for public access. None implies public access.

TYPE: int DEFAULT: None

access_type

Type of permission to be granted. One or more of CREATE, READ, DOWNLOAD, UPDATE, DELETE, CHANGE_PERMISSIONS.

Defaults to ['READ', 'DOWNLOAD']

TYPE: List[str] DEFAULT: None

modify_benefactor

Set as True when modifying a benefactor's ACL. The term 'benefactor' is used to indicate which Entity an Entity inherits its ACL from. For example, a newly created Project will be its own benefactor, while a new FileEntity's benefactor will start off as its containing Project. If the entity already has local sharing settings the benefactor would be itself. It may also be the immediate parent, somewhere in the parent tree, or the project itself.

TYPE: bool DEFAULT: False

warn_if_inherits

When modify_benefactor is True, this does not have any effect. When modify_benefactor is False, and warn_if_inherits is True, a warning log message is produced if the benefactor for the entity you passed into the function is not itself, i.e., it's the parent folder, or another entity in the parent tree.

TYPE: bool DEFAULT: True

overwrite

By default this function overwrites existing permissions for the specified user. Set this flag to False to add new permissions non-destructively.

TYPE: bool DEFAULT: True

synapse_client

If not passed in and caching was not disabled by Synapse.allow_client_caching(False) this will use the last created instance from the Synapse class constructor.

TYPE: Optional[Synapse] DEFAULT: None

RETURNS DESCRIPTION
Dict[str, Union[str, list]]
An Access Control List object matching https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/AccessControlList.html

Setting permissions

Grant all registered users download access

import asyncio
from synapseclient import Synapse
from synapseclient.models import File

syn = Synapse()
syn.login()

async def main():
    await File(id="syn123").set_permissions_async(principal_id=273948, access_type=['READ','DOWNLOAD'])

asyncio.run(main())

Grant the public view access

import asyncio
from synapseclient import Synapse
from synapseclient.models import File

syn = Synapse()
syn.login()

async def main():
    await File(id="syn123").set_permissions_async(principal_id=273949, access_type=['READ'])

asyncio.run(main())
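Adding permissions non-destructively

A minimal sketch of using overwrite=False to add permissions without replacing those the principal already has; the entity and principal IDs are placeholders.

import asyncio
from synapseclient import Synapse
from synapseclient.models import File

syn = Synapse()
syn.login()

async def main():
    await File(id="syn123").set_permissions_async(
        principal_id=273948,
        access_type=['READ', 'DOWNLOAD'],
        # Keep any permissions this principal already has
        overwrite=False,
    )

asyncio.run(main())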
Source code in synapseclient/models/mixins/access_control.py
async def set_permissions_async(
    self,
    principal_id: int = None,
    access_type: List[str] = None,
    modify_benefactor: bool = False,
    warn_if_inherits: bool = True,
    overwrite: bool = True,
    *,
    synapse_client: Optional[Synapse] = None,
) -> Dict[str, Union[str, list]]:
    """
    Sets permission that a user or group has on an Entity.
    An Entity may have its own ACL or inherit its ACL from a benefactor.

    Arguments:
        principal_id: Identifier of a user or group. `273948` is for all
            registered Synapse users and `273949` is for public access.
            None implies public access.
        access_type: Type of permission to be granted. One or more of CREATE,
            READ, DOWNLOAD, UPDATE, DELETE, CHANGE_PERMISSIONS.

            **Defaults to ['READ', 'DOWNLOAD']**
        modify_benefactor: Set as True when modifying a benefactor's ACL. The term
            'benefactor' is used to indicate which Entity an Entity inherits its
            ACL from. For example, a newly created Project will be its own
            benefactor, while a new FileEntity's benefactor will start off as its
            containing Project. If the entity already has local sharing settings
            the benefactor would be itself. It may also be the immediate parent,
            somewhere in the parent tree, or the project itself.
        warn_if_inherits: When `modify_benefactor` is True, this does not have any
            effect. When `modify_benefactor` is False, and `warn_if_inherits` is
            True, a warning log message is produced if the benefactor for the
            entity you passed into the function is not itself, i.e., it's the
            parent folder, or another entity in the parent tree.
        overwrite: By default this function overwrites existing permissions for
            the specified user. Set this flag to False to add new permissions
            non-destructively.
        synapse_client: If not passed in and caching was not disabled by
            `Synapse.allow_client_caching(False)` this will use the last created
            instance from the Synapse class constructor.

    Returns:
        An Access Control List object matching <https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/AccessControlList.html>.

    Example: Setting permissions
        Grant all registered users download access

        ```python
        import asyncio
        from synapseclient import Synapse
        from synapseclient.models import File

        syn = Synapse()
        syn.login()

        async def main():
            await File(id="syn123").set_permissions_async(principal_id=273948, access_type=['READ','DOWNLOAD'])

        asyncio.run(main())
        ```

        Grant the public view access

        ```python
        import asyncio
        from synapseclient import Synapse
        from synapseclient.models import File

        syn = Synapse()
        syn.login()

        async def main():
            await File(id="syn123").set_permissions_async(principal_id=273949, access_type=['READ'])

        asyncio.run(main())
        ```
    """
    if access_type is None:
        access_type = ["READ", "DOWNLOAD"]

    return await set_entity_permissions(
        entity_id=self.id,
        principal_id=str(principal_id) if principal_id is not None else None,
        access_type=access_type,
        modify_benefactor=modify_benefactor,
        warn_if_inherits=warn_if_inherits,
        overwrite=overwrite,
        synapse_client=synapse_client,
    )

delete_permissions_async async

delete_permissions_async(include_self: bool = True, include_container_content: bool = False, recursive: bool = False, target_entity_types: Optional[List[str]] = None, dry_run: bool = False, show_acl_details: bool = True, show_files_in_containers: bool = True, *, synapse_client: Optional[Synapse] = None, _benefactor_tracker: Optional[BenefactorTracker] = None) -> None

Delete the entire Access Control List (ACL) for a given Entity. This is not scoped to a specific user or group, but rather removes all permissions associated with the Entity. After this operation, the Entity will inherit permissions from its benefactor, which is typically its parent entity or the Project it belongs to.

In order to remove permissions for a specific user or group, you should use the set_permissions_async method with the access_type set to an empty list.
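
For example, a minimal sketch of revoking one user's permissions while leaving the rest of the ACL intact (the entity and principal IDs are placeholders):

import asyncio
from synapseclient import Synapse
from synapseclient.models import File

syn = Synapse()
syn.login()

async def main():
    # An empty access_type list revokes this principal's permissions
    await File(id="syn123").set_permissions_async(
        principal_id=273948,
        access_type=[],
    )

asyncio.run(main())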

By default, Entities such as FileEntity and Folder inherit their permissions from their containing Project. For such Entities the Project is the Entity's 'benefactor'. This permission inheritance can be overridden by creating an ACL for the Entity. When this occurs, the Entity becomes its own benefactor and all permissions are determined by its own ACL.

If the ACL of an Entity is deleted, then its benefactor will automatically be set to its parent's benefactor.

Special notice for Projects: The ACL for a Project cannot be deleted, you must individually update or revoke the permissions for each user or group.

PARAMETER DESCRIPTION
include_self

If True (default), delete the ACL of the current entity. If False, skip deleting the ACL of the current entity.

TYPE: bool DEFAULT: True

include_container_content

If True, delete ACLs from contents directly within containers (files and folders inside self). This must be set to True for recursive to have any effect. Defaults to False.

TYPE: bool DEFAULT: False

recursive

If True and the entity is a container (e.g., Project or Folder), recursively process child containers. Note that this must be used with include_container_content=True to have any effect. Setting recursive=True with include_container_content=False will raise a ValueError. Only works on classes that support the sync_from_synapse_async method.

TYPE: bool DEFAULT: False

target_entity_types

Specify which entity types to process when deleting ACLs. Allowed values are "folder", "file", "project", "table", "entityview", "materializedview", "virtualtable", "dataset", "datasetcollection", "submissionview" (case-insensitive). If None, defaults to ["folder", "file"]. This does not affect the entity type of the current entity, which is always processed if include_self=True.

TYPE: Optional[List[str]] DEFAULT: None

dry_run

If True, log the changes that would be made instead of actually performing the deletions. When enabled, all ACL deletion operations are simulated and logged at info level. Defaults to False.

TYPE: bool DEFAULT: False

show_acl_details

When dry_run=True, controls whether current ACL details are displayed for entities that will have their permissions changed. If True (default), shows detailed ACL information. If False, hides ACL details for cleaner output. Has no effect when dry_run=False.

TYPE: bool DEFAULT: True

show_files_in_containers

When dry_run=True, controls whether files within containers are displayed in the preview. If True (default), shows all files. If False, hides files when their only change is benefactor inheritance (but still shows files with local ACLs being deleted). Has no effect when dry_run=False.

TYPE: bool DEFAULT: True

synapse_client

If not passed in and caching was not disabled by Synapse.allow_client_caching(False) this will use the last created instance from the Synapse class constructor.

TYPE: Optional[Synapse] DEFAULT: None

_benefactor_tracker

Internal use tracker for managing benefactor relationships. Used for recursive functionality to track which entities will be affected

TYPE: Optional[BenefactorTracker] DEFAULT: None

RETURNS DESCRIPTION
None

None

RAISES DESCRIPTION
ValueError

If the entity does not have an ID or if an invalid entity type is provided.

SynapseHTTPError

If there are permission issues or if the entity already inherits permissions.

Exception

For any other errors that may occur during the process.

Note: The caller must be granted ACCESS_TYPE.CHANGE_PERMISSIONS on the Entity to call this method.

Delete permissions for a single entity
import asyncio
from synapseclient import Synapse
from synapseclient.models import File

syn = Synapse()
syn.login()

async def main():
    await File(id="syn123").delete_permissions_async()

asyncio.run(main())
Delete permissions recursively for a folder and all its children
import asyncio
from synapseclient import Synapse
from synapseclient.models import Folder

syn = Synapse()
syn.login()

async def main():
    # Delete permissions for this folder only (does not affect children)
    await Folder(id="syn123").delete_permissions_async()

    # Delete permissions for all files and folders directly within this folder,
    # but not the folder itself
    await Folder(id="syn123").delete_permissions_async(
        include_self=False,
        include_container_content=True
    )

    # Delete permissions for all items in the entire hierarchy (folders and their files)
    # Both recursive and include_container_content must be True
    await Folder(id="syn123").delete_permissions_async(
        recursive=True,
        include_container_content=True
    )

    # Delete permissions only for folder entities within this folder recursively
    # and their contents
    await Folder(id="syn123").delete_permissions_async(
        recursive=True,
        include_container_content=True,
        target_entity_types=["folder"]
    )

    # Delete permissions only for files within this folder and all subfolders
    await Folder(id="syn123").delete_permissions_async(
        include_self=False,
        recursive=True,
        include_container_content=True,
        target_entity_types=["file"]
    )

    # Delete permissions for specific entity types (e.g., tables and views)
    await Folder(id="syn123").delete_permissions_async(
        recursive=True,
        include_container_content=True,
        target_entity_types=["table", "entityview", "materializedview"]
    )

    # Dry run example: Log what would be deleted without making changes
    await Folder(id="syn123").delete_permissions_async(
        recursive=True,
        include_container_content=True,
        dry_run=True
    )
asyncio.run(main())
Source code in synapseclient/models/mixins/access_control.py
async def delete_permissions_async(
    self,
    include_self: bool = True,
    include_container_content: bool = False,
    recursive: bool = False,
    target_entity_types: Optional[List[str]] = None,
    dry_run: bool = False,
    show_acl_details: bool = True,
    show_files_in_containers: bool = True,
    *,
    synapse_client: Optional[Synapse] = None,
    _benefactor_tracker: Optional[BenefactorTracker] = None,
) -> None:
    """
    Delete the entire Access Control List (ACL) for a given Entity. This is not
    scoped to a specific user or group, but rather removes all permissions
    associated with the Entity. After this operation, the Entity will inherit
    permissions from its benefactor, which is typically its parent entity or
    the Project it belongs to.

    In order to remove permissions for a specific user or group, you
    should use the `set_permissions_async` method with the `access_type` set to
    an empty list.

    By default, Entities such as FileEntity and Folder inherit their permission from
    their containing Project. For such Entities the Project is the Entity's 'benefactor'.
    This permission inheritance can be overridden by creating an ACL for the Entity.
    When this occurs the Entity becomes its own benefactor and all permission are
    determined by its own ACL.

    If the ACL of an Entity is deleted, then its benefactor will automatically be set
    to its parent's benefactor.

    **Special notice for Projects:** The ACL for a Project cannot be deleted, you
    must individually update or revoke the permissions for each user or group.

    Arguments:
        include_self: If True (default), delete the ACL of the current entity.
            If False, skip deleting the ACL of the current entity.
        include_container_content: If True, delete ACLs from contents directly within
            containers (files and folders inside self). This must be set to
            True for recursive to have any effect. Defaults to False.
        recursive: If True and the entity is a container (e.g., Project or Folder),
            recursively process child containers. Note that this must be used with
            include_container_content=True to have any effect. Setting recursive=True
            with include_container_content=False will raise a ValueError.
            Only works on classes that support the `sync_from_synapse_async` method.
        target_entity_types: Specify which entity types to process when deleting ACLs.
            Allowed values are "folder", "file", "project", "table", "entityview",
            "materializedview", "virtualtable", "dataset", "datasetcollection",
            "submissionview" (case-insensitive). If None, defaults to ["folder", "file"].
            This does not affect the entity type of the current entity, which is always
            processed if `include_self=True`.
        dry_run: If True, log the changes that would be made instead of actually
            performing the deletions. When enabled, all ACL deletion operations are
            simulated and logged at info level. Defaults to False.
        show_acl_details: When dry_run=True, controls whether current ACL details are
            displayed for entities that will have their permissions changed. If True (default),
            shows detailed ACL information. If False, hides ACL details for cleaner output.
            Has no effect when dry_run=False.
        show_files_in_containers: When dry_run=True, controls whether files within containers
            are displayed in the preview. If True (default), shows all files. If False, hides
            files when their only change is benefactor inheritance (but still shows files with
            local ACLs being deleted). Has no effect when dry_run=False.
        synapse_client: If not passed in and caching was not disabled by
            `Synapse.allow_client_caching(False)` this will use the last created
            instance from the Synapse class constructor.
        _benefactor_tracker: Internal use tracker for managing benefactor relationships.
            Used for recursive functionality to track which entities will be affected

    Returns:
        None

    Raises:
        ValueError: If the entity does not have an ID or if an invalid entity type is provided.
        SynapseHTTPError: If there are permission issues or if the entity already inherits permissions.
        Exception: For any other errors that may occur during the process.

    Note: The caller must be granted ACCESS_TYPE.CHANGE_PERMISSIONS on the Entity to
    call this method.

    Example: Delete permissions for a single entity
        ```python
        import asyncio
        from synapseclient import Synapse
        from synapseclient.models import File

        syn = Synapse()
        syn.login()

        async def main():
            await File(id="syn123").delete_permissions_async()

        asyncio.run(main())
        ```

    Example: Delete permissions recursively for a folder and all its children
        ```python
        import asyncio
        from synapseclient import Synapse
        from synapseclient.models import Folder

        syn = Synapse()
        syn.login()

        async def main():
            # Delete permissions for this folder only (does not affect children)
            await Folder(id="syn123").delete_permissions_async()

            # Delete permissions for all files and folders directly within this folder,
            # but not the folder itself
            await Folder(id="syn123").delete_permissions_async(
                include_self=False,
                include_container_content=True
            )

            # Delete permissions for all items in the entire hierarchy (folders and their files)
            # Both recursive and include_container_content must be True
            await Folder(id="syn123").delete_permissions_async(
                recursive=True,
                include_container_content=True
            )

            # Delete permissions only for folder entities within this folder recursively
            # and their contents
            await Folder(id="syn123").delete_permissions_async(
                recursive=True,
                include_container_content=True,
                target_entity_types=["folder"]
            )

            # Delete permissions only for files within this folder and all subfolders
            await Folder(id="syn123").delete_permissions_async(
                include_self=False,
                recursive=True,
                include_container_content=True,
                target_entity_types=["file"]
            )

            # Delete permissions for specific entity types (e.g., tables and views)
            await Folder(id="syn123").delete_permissions_async(
                recursive=True,
                include_container_content=True,
                target_entity_types=["table", "entityview", "materializedview"]
            )

            # Dry run example: Log what would be deleted without making changes
            await Folder(id="syn123").delete_permissions_async(
                recursive=True,
                include_container_content=True,
                dry_run=True
            )
        asyncio.run(main())
        ```
    """
    if not self.id:
        raise ValueError("The entity must have an ID to delete permissions.")

    client = Synapse.get_client(synapse_client=synapse_client)

    if include_self and self.__class__.__name__.lower() == "project":
        client.logger.warning(
            "The ACL for a Project cannot be deleted, you must individually update or "
            "revoke the permissions for each user or group. Continuing without deleting "
            "the Project's ACL."
        )
        include_self = False

    normalized_types = self._normalize_target_entity_types(target_entity_types)

    is_top_level = not _benefactor_tracker
    benefactor_tracker = _benefactor_tracker or BenefactorTracker()

    should_process_children = (recursive or include_container_content) and hasattr(
        self, "sync_from_synapse_async"
    )
    all_entities = [self] if include_self else []

    custom_message = "Deleting ACLs [Dry Run]..." if dry_run else "Deleting ACLs..."
    with shared_download_progress_bar(
        file_size=1, synapse_client=client, custom_message=custom_message, unit=None
    ) as progress_bar:
        if progress_bar:
            progress_bar.update(1)  # Initial setup complete

        if should_process_children:
            if recursive and not include_container_content:
                raise ValueError(
                    "When recursive=True, include_container_content must also be True. "
                    "Setting recursive=True with include_container_content=False has no effect."
                )

            if progress_bar:
                progress_bar.total += 1
                progress_bar.refresh()

            all_entities = await self._collect_entities(
                client=client,
                target_entity_types=normalized_types,
                include_container_content=include_container_content,
                recursive=recursive,
                progress_bar=progress_bar,
            )
            if progress_bar:
                progress_bar.update(1)

            entity_ids = [entity.id for entity in all_entities if entity.id]
            if entity_ids:
                if progress_bar:
                    progress_bar.total += 1
                    progress_bar.refresh()
                await benefactor_tracker.track_entity_benefactor(
                    entity_ids=entity_ids,
                    synapse_client=client,
                    progress_bar=progress_bar,
                )
            else:
                if progress_bar:
                    progress_bar.total += 1
                    progress_bar.refresh()
                    progress_bar.update(1)

        if is_top_level:
            if progress_bar:
                progress_bar.total += 1
                progress_bar.refresh()
            await self._build_and_log_run_tree(
                client=client,
                benefactor_tracker=benefactor_tracker,
                collected_entities=all_entities,
                include_self=include_self,
                show_acl_details=show_acl_details,
                show_files_in_containers=show_files_in_containers,
                progress_bar=progress_bar,
                dry_run=dry_run,
            )

        if dry_run:
            return

        if include_self:
            if progress_bar:
                progress_bar.total += 1
                progress_bar.refresh()
            await self._delete_current_entity_acl(
                client=client,
                benefactor_tracker=benefactor_tracker,
                progress_bar=progress_bar,
            )

        if should_process_children:
            if include_container_content:
                if progress_bar:
                    progress_bar.total += 1
                    progress_bar.refresh()
                await self._process_container_contents(
                    client=client,
                    target_entity_types=normalized_types,
                    benefactor_tracker=benefactor_tracker,
                    progress_bar=progress_bar,
                    recursive=recursive,
                    include_container_content=include_container_content,
                )
                if progress_bar:
                    progress_bar.update(1)  # Process container contents complete

list_acl_async async

list_acl_async(recursive: bool = False, include_container_content: bool = False, target_entity_types: Optional[List[str]] = None, log_tree: bool = False, *, synapse_client: Optional[Synapse] = None, _progress_bar: Optional[tqdm] = None) -> AclListResult

List the Access Control Lists (ACLs) for this entity and optionally its children.

This function returns the local sharing settings for the entity and optionally its children. It provides a mapping of all ACLs for the given container/entity.

Important Note: This function returns the LOCAL sharing settings only, not the effective permissions that each Synapse User ID/Team has on the entities. More permissive access could be granted through a Team the user belongs to that has permissions on the entity, or through inheritance from parent entities.

PARAMETER DESCRIPTION
recursive

If True and the entity is a container (e.g., Project or Folder), recursively process child containers. Note that this must be used with include_container_content=True to have any effect. Setting recursive=True with include_container_content=False will raise a ValueError. Only works on classes that support the sync_from_synapse_async method.

TYPE: bool DEFAULT: False

include_container_content

If True, include ACLs from contents directly within containers (files and folders inside self). This must be set to True for recursive to have any effect. Defaults to False.

TYPE: bool DEFAULT: False

target_entity_types

Specify which entity types to process when listing ACLs. Allowed values are "folder", "file", "project", "table", "entityview", "materializedview", "virtualtable", "dataset", "datasetcollection", "submissionview" (case-insensitive). If None, defaults to ["folder", "file"].

TYPE: Optional[List[str]] DEFAULT: None

log_tree

If True, logs the ACL results to console in ASCII tree format showing entity hierarchies and their ACL permissions in a tree-like structure. Defaults to False.

TYPE: bool DEFAULT: False

synapse_client

If not passed in and caching was not disabled by Synapse.allow_client_caching(False) this will use the last created instance from the Synapse class constructor.

TYPE: Optional[Synapse] DEFAULT: None

_progress_bar

Internal parameter. Progress bar instance to use for updates when called recursively. Should not be used by external callers.

TYPE: Optional[tqdm] DEFAULT: None

RETURNS DESCRIPTION
AclListResult

An AclListResult object containing a structured representation of ACLs where:

  • entity_acls: A list of EntityAcl objects, each representing one entity's ACL
  • Each EntityAcl contains acl_entries (a list of AclEntry objects)
  • Each AclEntry contains the principal_id and their list of permissions

RAISES DESCRIPTION
ValueError

If the entity does not have an ID or if an invalid entity type is provided.

SynapseHTTPError

If there are permission issues accessing ACLs.

Exception

For any other errors that may occur during the process.

List ACLs for a single entity
import asyncio
from synapseclient import Synapse
from synapseclient.models import File

syn = Synapse()
syn.login()

async def main():
    acl_result = await File(id="syn123").list_acl_async()
    print(acl_result)

    # Access entity ACLs (entity_acls is a list, not a dict)
    for entity_acl in acl_result.all_entity_acls:
        if entity_acl.entity_id == "syn123":
            # Access individual ACL entries
            for acl_entry in entity_acl.acl_entries:
                if acl_entry.principal_id == "273948":
                    print(f"Principal 273948 has permissions: {acl_entry.permissions}")

    # You can also access the ACL for the file itself
    print(acl_result.entity_acl)

    print(acl_result)

asyncio.run(main())
List ACLs recursively for a folder and all its children
import asyncio
from synapseclient import Synapse
from synapseclient.models import Folder

syn = Synapse()
syn.login()

async def main():
    acl_result = await Folder(id="syn123").list_acl_async(
        recursive=True,
        include_container_content=True
    )

    # Access each entity's ACL (entity_acls is a list)
    for entity_acl in acl_result.all_entity_acls:
        print(f"Entity {entity_acl.entity_id} has ACL with {len(entity_acl.acl_entries)} principals")

    # You can also access the ACL for the folder itself
    print(acl_result.entity_acl)

    # List ACLs for only folder entities
    folder_acl_result = await Folder(id="syn123").list_acl_async(
        recursive=True,
        include_container_content=True,
        target_entity_types=["folder"]
    )

    # List ACLs for specific entity types (e.g., tables and views)
    table_view_acl_result = await Folder(id="syn123").list_acl_async(
        recursive=True,
        include_container_content=True,
        target_entity_types=["table", "entityview", "materializedview"]
    )

asyncio.run(main())
List ACLs with ASCII tree visualization

When log_tree=True, the ACLs will be logged in a tree format. Additionally, the ascii_tree attribute of the AclListResult will contain the ASCII tree representation of the ACLs.

import asyncio
from synapseclient import Synapse
from synapseclient.models import Folder

syn = Synapse()
syn.login()

async def main():
    acl_result = await Folder(id="syn123").list_acl_async(
        recursive=True,
        include_container_content=True,
        log_tree=True, # Enable ASCII tree logging
    )

    # The ASCII tree representation of the ACLs will also be available
    # in acl_result.ascii_tree
    print(acl_result.ascii_tree)

asyncio.run(main())
Source code in synapseclient/models/mixins/access_control.py
async def list_acl_async(
    self,
    recursive: bool = False,
    include_container_content: bool = False,
    target_entity_types: Optional[List[str]] = None,
    log_tree: bool = False,
    *,
    synapse_client: Optional[Synapse] = None,
    _progress_bar: Optional[tqdm] = None,  # Internal parameter for recursive calls
) -> AclListResult:
    """
    List the Access Control Lists (ACLs) for this entity and optionally its children.

    This function returns the local sharing settings for the entity and optionally
    its children. It provides a mapping of all ACLs for the given container/entity.

    **Important Note:** This function returns the LOCAL sharing settings only, not
    the effective permissions that each Synapse User ID/Team has on the entities.
    More permissive access may be granted through a Team that the user belongs to
    and that holds permissions on the entity, or through inheritance from parent entities.

    Arguments:
        recursive: If True and the entity is a container (e.g., Project or Folder),
            recursively process child containers. Note that this must be used with
            include_container_content=True to have any effect. Setting recursive=True
            with include_container_content=False will raise a ValueError.
            Only works on classes that support the `sync_from_synapse_async` method.
        include_container_content: If True, include ACLs from contents directly within
            containers (files and folders inside self). This must be set to
            True for recursive to have any effect. Defaults to False.
        target_entity_types: Specify which entity types to process when listing ACLs.
            Allowed values are "folder", "file", "project", "table", "entityview",
            "materializedview", "virtualtable", "dataset", "datasetcollection",
            "submissionview" (case-insensitive). If None, defaults to ["folder", "file"].
        log_tree: If True, logs the ACL results to console in ASCII tree format showing
            entity hierarchies and their ACL permissions in a tree-like structure.
            Defaults to False.
        synapse_client: If not passed in and caching was not disabled by
            `Synapse.allow_client_caching(False)` this will use the last created
            instance from the Synapse class constructor.
        _progress_bar: Internal parameter. Progress bar instance to use for updates
            when called recursively. Should not be used by external callers.

    Returns:
        An AclListResult object containing a structured representation of ACLs where:
        - all_entity_acls: A list of EntityAcl objects, each representing one entity's ACL
        - entity_acl: The EntityAcl of this entity itself
        - Each EntityAcl contains acl_entries (a list of AclEntry objects)
        - Each AclEntry contains the principal_id and their list of permissions

    Raises:
        ValueError: If the entity does not have an ID or if an invalid entity type is provided.
        SynapseHTTPError: If there are permission issues accessing ACLs.
        Exception: For any other errors that may occur during the process.

    Example: List ACLs for a single entity
        ```python
        import asyncio
        from synapseclient import Synapse
        from synapseclient.models import File

        syn = Synapse()
        syn.login()

        async def main():
            acl_result = await File(id="syn123").list_acl_async()
            print(acl_result)

            # Access entity ACLs (all_entity_acls is a list, not a dict)
            for entity_acl in acl_result.all_entity_acls:
                if entity_acl.entity_id == "syn123":
                    # Access individual ACL entries
                    for acl_entry in entity_acl.acl_entries:
                        if acl_entry.principal_id == "273948":
                            print(f"Principal 273948 has permissions: {acl_entry.permissions}")

            # You can also access the ACL for the file itself
            print(acl_result.entity_acl)

        asyncio.run(main())
        ```

    Example: List ACLs recursively for a folder and all its children
        ```python
        import asyncio
        from synapseclient import Synapse
        from synapseclient.models import Folder

        syn = Synapse()
        syn.login()

        async def main():
            acl_result = await Folder(id="syn123").list_acl_async(
                recursive=True,
                include_container_content=True
            )

            # Access each entity's ACL (all_entity_acls is a list)
            for entity_acl in acl_result.all_entity_acls:
                print(f"Entity {entity_acl.entity_id} has ACL with {len(entity_acl.acl_entries)} principals")

            # You can also access the ACL for the folder itself
            print(acl_result.entity_acl)

            # List ACLs for only folder entities
            folder_acl_result = await Folder(id="syn123").list_acl_async(
                recursive=True,
                include_container_content=True,
                target_entity_types=["folder"]
            )

            # List ACLs for specific entity types (e.g., tables and views)
            table_view_acl_result = await Folder(id="syn123").list_acl_async(
                recursive=True,
                include_container_content=True,
                target_entity_types=["table", "entityview", "materializedview"]
            )

        asyncio.run(main())
        ```

    Example: List ACLs with ASCII tree visualization
        When `log_tree=True`, the ACLs will be logged in a tree format. Additionally,
        the `ascii_tree` attribute of the AclListResult will contain the ASCII tree
        representation of the ACLs.

        ```python
        import asyncio
        from synapseclient import Synapse
        from synapseclient.models import Folder

        syn = Synapse()
        syn.login()

        async def main():
            acl_result = await Folder(id="syn123").list_acl_async(
                recursive=True,
                include_container_content=True,
                log_tree=True, # Enable ASCII tree logging
            )

            # The ASCII tree representation of the ACLs will also be available
            # in acl_result.ascii_tree
            print(acl_result.ascii_tree)

        asyncio.run(main())
        ```
    """
    if not self.id:
        raise ValueError("The entity must have an ID to list ACLs.")

    normalized_types = self._normalize_target_entity_types(target_entity_types)
    client = Synapse.get_client(synapse_client=synapse_client)

    all_acls: Dict[str, Dict[str, List[str]]] = {}
    all_entities = []

    # Only update progress bar for self ACL if we're the top-level call (not recursive)
    # When _progress_bar is passed, it means this is a recursive call and the parent
    # is managing progress updates
    update_progress_for_self = _progress_bar is None
    acl = await self._get_current_entity_acl(
        client=client,
        progress_bar=_progress_bar if update_progress_for_self else None,
    )
    if acl is not None:
        all_acls[self.id] = acl
    all_entities.append(self)

    should_process_children = (recursive or include_container_content) and hasattr(
        self, "sync_from_synapse_async"
    )

    if should_process_children and (recursive and not include_container_content):
        raise ValueError(
            "When recursive=True, include_container_content must also be True. "
            "Setting recursive=True with include_container_content=False has no effect."
        )

    if should_process_children and _progress_bar is None:
        with shared_download_progress_bar(
            file_size=1,
            synapse_client=client,
            custom_message="Collecting ACLs...",
            unit=None,
        ) as progress_bar:
            await self._process_children_with_progress(
                client=client,
                normalized_types=normalized_types,
                include_container_content=include_container_content,
                recursive=recursive,
                all_entities=all_entities,
                all_acls=all_acls,
                progress_bar=progress_bar,
            )
            # Ensure progress bar reaches 100% completion
            if progress_bar:
                remaining = (
                    progress_bar.total - progress_bar.n
                    if progress_bar.total > progress_bar.n
                    else 0
                )
                if remaining > 0:
                    progress_bar.update(remaining)
    elif should_process_children:
        await self._process_children_with_progress(
            client=client,
            normalized_types=normalized_types,
            include_container_content=include_container_content,
            recursive=recursive,
            all_entities=all_entities,
            all_acls=all_acls,
            progress_bar=_progress_bar,
        )
    current_acl = all_acls.get(self.id)
    acl_result = AclListResult.from_dict(
        all_acl_dict=all_acls, current_acl_dict=current_acl
    )

    if log_tree:
        logged_tree = await self._log_acl_tree(acl_result, all_entities, client)
        acl_result.ascii_tree = logged_tree

    return acl_result
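
Because list_acl_async returns only the local sharing settings, downstream code often just needs a quick principal-to-permissions lookup across every entity it returned. A minimal sketch of flattening the structured AclListResult into a plain dictionary, assuming the attribute names documented in the docstring above (all_entity_acls, entity_id, acl_entries, principal_id, permissions):

```python
import asyncio
from synapseclient import Synapse
from synapseclient.models import Folder

syn = Synapse()
syn.login()

async def main():
    acl_result = await Folder(id="syn123").list_acl_async(
        recursive=True,
        include_container_content=True,
    )

    # Flatten the structured result into {entity_id: {principal_id: [permissions]}}.
    # Attribute names follow the docstring above; adjust if your client version differs.
    flattened = {
        entity_acl.entity_id: {
            acl_entry.principal_id: acl_entry.permissions
            for acl_entry in entity_acl.acl_entries
        }
        for entity_acl in acl_result.all_entity_acls
    }

    # Look up the permissions a principal holds on a specific entity, if any.
    print(flattened.get("syn123", {}).get("273948", []))

asyncio.run(main())
```

Keep in mind these are LOCAL settings only: a principal absent from this mapping may still have access through a Team or an ACL inherited from a parent entity.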

synapseclient.models.EntityRef dataclass

Represents a reference to the id and version of an entity to be used in Dataset and DatasetCollection objects.

ATTRIBUTE DESCRIPTION
id

The Synapse ID of the entity.

TYPE: str

version

Indicates a specific version of the entity.

TYPE: int
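
EntityRef objects are the building blocks for dataset items. A minimal sketch of pinning specific file versions in a Dataset, assuming Dataset accepts a list of EntityRef objects through an items argument (the entity IDs below are placeholders; see the Dataset reference above for the authoritative signature):

```python
from synapseclient import Synapse
from synapseclient.models import Dataset, EntityRef

syn = Synapse()
syn.login()

# Pin two files at specific versions as dataset items.
# "syn1234", "syn456", and "syn789" are placeholder IDs.
dataset = Dataset(
    name="My Curated Dataset",
    parent_id="syn1234",
    items=[
        EntityRef(id="syn456", version=1),
        EntityRef(id="syn789", version=3),
    ],
)
dataset = dataset.store()
```

Referencing an explicit version means the dataset keeps pointing at that snapshot of each file even if the files are later updated.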

Source code in synapseclient/models/dataset.py
@dataclass
class EntityRef:
    """
    Represents a reference to the id and version of an entity to be used in `Dataset` and
    `DatasetCollection` objects.

    Attributes:
        id: The Synapse ID of the entity.
        version: Indicates a specific version of the entity.
    """

    id: str
    """The Synapse ID of the entity."""

    version: int
    """Indicates a specific version of the entity."""

    def to_synapse_request(self):
        """Converts the attributes of an EntityRef instance to a
        request expected of the Synapse REST API."""

        return {
            "entityId": self.id,
            "versionNumber": self.version,
        }

Attributes

id instance-attribute

id: str

The Synapse ID of the entity.

version instance-attribute

version: int

Indicates a specific version of the entity.

Functions

to_synapse_request

to_synapse_request()

Converts the attributes of an EntityRef instance to the request format expected by the Synapse REST API.

Source code in synapseclient/models/dataset.py
def to_synapse_request(self):
    """Converts the attributes of an EntityRef instance to a
    request expected of the Synapse REST API."""

    return {
        "entityId": self.id,
        "versionNumber": self.version,
    }
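
For reference, a short sketch of serializing an EntityRef by hand; the expected payload follows directly from the source above:

```python
from synapseclient.models import EntityRef

ref = EntityRef(id="syn456", version=2)

# Produces the request payload shown in the source above:
# {'entityId': 'syn456', 'versionNumber': 2}
print(ref.to_synapse_request())
```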