# CitylifeSim.github.io

Public site of CityLife Sim


## The CityLife Simulation: A High-Fidelity Pedestrian and Vehicle Simulation with Complex Behaviors

This is the official code repository to accompany the paper “CityLifeSim: A High-Fidelity Pedestrian and Vehicle Simulation with Complex Behaviors”. Here we provide Python code for generating and running scenarios using the simulation environment, as well as links to the datasets and code used for running our experiments.

CityLife is a flexible, high-fidelity simulation that allows users to define complex scenarios with essentially unlimited actors, including both pedestrians and vehicles. This tool allows each vehicle and pedestrian to operate with basic intelligence that governs the *low-level* controls needed to maneuver: avoiding collisions, navigating corners, stopping at traffic lights, etc. The high-level controls for each agent then allow the user to define behaviors in an abstract form, controlling their sequence of actions (e.g., hurry to this intersection, then cross the road, turn left at the park, then wait for a bus at the stop), their speed changes in different legs of the journey, their stopping distances, their susceptibility to influence from their environment, and their risk-taking behavior.
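To make the high-level abstraction above concrete, here is a small sketch of what an agent plan could look like. The class, field names, and values are purely illustrative assumptions for this page; they are not the repository's actual API, which defines scenarios via CSV files (see below).

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch of the high-level plan abstraction described above;
# names and fields are illustrative, not CityLifeSim's actual interface.
@dataclass
class AgentPlan:
    actions: List[str] = field(default_factory=list)   # ordered high-level actions
    speeds: List[float] = field(default_factory=list)  # target speed per leg (m/s)
    stopping_distance: float = 2.0                     # distance kept from obstacles
    risk_taking: float = 0.0                           # 0 = cautious, 1 = reckless

    def add_leg(self, action: str, speed: float) -> "AgentPlan":
        """Append one leg of the journey (e.g. 'cross_road', 'wait_for_bus')."""
        self.actions.append(action)
        self.speeds.append(speed)
        return self

# A pedestrian journey expressed as a sequence of abstract legs:
plan = (AgentPlan(risk_taking=0.2)
        .add_leg("hurry_to_intersection", 2.5)
        .add_leg("cross_road", 1.4)
        .add_leg("wait_for_bus", 0.0))
```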

[Paper] (…) Cheng Yao Wang, Eyal Ofek, Daniel McDuff, Oron Nir, Sai Vemprala, Ashish Kapoor, Mar Gonzalez-Franco (2022). “CityLifeSim: A High-Fidelity Pedestrian and Vehicle Simulation with Complex Behaviors.” IEEE ICIR.

Video

### Dataset

The dataset contains videos (RGB, depth, and segmentation frames) of six scenarios. There are a total of 128 pedestrians in each video. One of the scenarios is captured from 17 different points of view (i.e., cameras) to simulate static viewpoints; the others are captured from cameras on moving autonomous vehicles. Here are the download links for each scenario:

• CCTV: 17 cameras covering different waypoints and points of interest.
• car_sunny: the front camera on a car during sunny weather.
• car_weather: the front camera on a car during rainy weather.
• car_snowy: the front camera on a car during snowy weather.
• drone_front: the front camera on a drone.
• drone_downward: a camera pointing down from a drone.

## Generate Your Own Pedestrian Scenarios

• code/generate_scenario.py provides an example of how to programmatically create scenarios (TRAVERSE_TYPE: random, a_star):
```
$ python generate_scenario.py --traverse_type <TRAVERSE_TYPE> --out_file <SCENARIO_FILE_NAME>.csv
```

• CityLife_randomwalk_128_v6.csv shows an example of the CSV output that is generated.

## Run CityLifeSim

• Set up the CityLifeSim Python client environment:
  • Install Anaconda and open the Anaconda Prompt.
  • Create the conda environment for CityLifeSim:

```
conda env create -f \CityLife_v1\citylifesim.yml
conda activate citylifesim
```

  • The CityLifeSim Python client currently runs on AirSim 1.5.0. Newer versions will not work due to syntax changes in some AirSim APIs.
• Run CityEnv.exe:
  • Please check the AirSim guide on how to move around the environment in the different modes (ComputerVision, Car, Multirotor).
  • Modify settings.json in Documents\AirSim based on your needs: use ComputerVision mode for the CCTV camera mode, Car mode for the car camera mode, and Multirotor mode for the drone camera mode.
• Prepare pedestrian scenarios:
  • Put the scenario CSV file in the \CityLifeSim\WindowsNoEditor\CityEnv\Saved folder.
• Run the pedestrian scenario simulation (CAM_MODE: cctv, car, drone):

```
$ python run_scenario.py --ped_scenario <SCENARIO_FILE_NAME> --cam_mode <CAM_MODE> --recording
```
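The scenario-generation step above writes waypoints for each pedestrian to a CSV file. As a rough illustration of a random-walk generator, here is a minimal sketch; the column names and value ranges are assumptions made for this example only — consult CityLife_randomwalk_128_v6.csv for the actual schema the simulator expects.

```python
import csv
import random

def write_random_scenario(path, n_peds=4, n_waypoints=5, seed=0):
    """Write a hypothetical random-walk pedestrian scenario to CSV.

    Columns (ped_id, step, x, y, speed) are illustrative, NOT the
    schema used by CityLifeSim's generate_scenario.py.
    """
    rng = random.Random(seed)
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["ped_id", "step", "x", "y", "speed"])
        for ped in range(n_peds):
            # start each pedestrian at a random position
            x, y = rng.uniform(-100, 100), rng.uniform(-100, 100)
            for step in range(n_waypoints):
                # random-walk displacement per leg
                x += rng.uniform(-10, 10)
                y += rng.uniform(-10, 10)
                writer.writerow([ped, step, round(x, 2), round(y, 2),
                                 round(rng.uniform(0.5, 2.0), 2)])
```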

• You can add --car_scenario to run the car scenario at the same time. We currently provide Scenario_[1-100] for testing.
• Please check the CausalCity project for generating car scenarios.
• The recorded RGB-D images folder is by default placed in the Documents folder (or the location specified in settings), named with the timestamp of when the recording started in %Y-%M-%D-%H-%M-%S format.
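Because the recording folders are named by zero-padded timestamp, lexicographic order matches chronological order, so the most recent run can be found by plain string sorting. A small helper sketch (the function name is ours, not part of the repository):

```python
from pathlib import Path

def latest_recording(root):
    """Return the Path of the most recent recording folder under `root`.

    Assumes subfolder names are the zero-padded start timestamps described
    above, so the lexicographically largest name is the newest recording.
    """
    runs = [p for p in Path(root).iterdir() if p.is_dir()]
    if not runs:
        raise FileNotFoundError(f"no recording folders under {root}")
    return max(runs, key=lambda p: p.name)  # newest timestamp sorts last
```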

## Controlling Environment

• Environmental variables (e.g., weather, time of day) that can act as confounders in a dataset can be controlled using the AirSim APIs. Please check the AirSim documentation.
• code/control_trafficlight.py provides an example of how to programmatically control the traffic lights to override the default behavior.
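As a sketch of controlling weather confounders through the AirSim client API (here assumed to be AirSim 1.5.0's `simEnableWeather` / `simSetWeatherParameter` methods), the helper below is duck-typed over the client so the intensities can be clamped and applied in one place. The function name and the clamping behavior are our own illustration, not part of CityLifeSim.

```python
def apply_weather(client, intensities):
    """Enable weather and set each parameter, clamping intensity to [0, 1].

    `client` is any object exposing the AirSim weather API
    (simEnableWeather / simSetWeatherParameter). Keys of `intensities`
    are weather-parameter handles such as airsim.WeatherParameter.Rain.
    """
    client.simEnableWeather(True)
    applied = {}
    for param, value in intensities.items():
        clamped = min(1.0, max(0.0, value))
        client.simSetWeatherParameter(param, clamped)
        applied[param] = clamped
    return applied

# Example usage against a running simulator (not executed here):
#   import airsim
#   client = airsim.VehicleClient()
#   client.confirmConnection()
#   apply_weather(client, {airsim.WeatherParameter.Rain: 0.4,
#                          airsim.WeatherParameter.Fog: 0.2})
```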

## Generating Bounding Boxes from Segmentation Images

• To generate bounding boxes:
```
$ python seg2bbox.py --folder <RGB-D FOLDER> --seg_rgbs <FILE_PATH> --save_image
```

• Read peds_bbox.json and plot the bounding boxes:

```
$ python vis_bbox.py --folder <RGB-D FOLDER> --image_id <RGB_IMAGE_ID>
```
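The core idea behind seg2bbox.py — turning a segmentation image into per-instance axis-aligned boxes — can be sketched in pure Python. Here a mask is a 2D list of instance ids with 0 as background; the actual script works on the simulator's segmentation frames and RGB-to-id mapping, which this sketch does not model.

```python
def masks_to_bboxes(mask):
    """Return {instance_id: (x_min, y_min, x_max, y_max)} in pixel coords.

    `mask` is a 2D list where each entry is an instance id; id 0 is
    treated as background and skipped. Each box is grown to cover every
    pixel belonging to its instance.
    """
    boxes = {}
    for y, row in enumerate(mask):
        for x, inst in enumerate(row):
            if inst == 0:
                continue
            x0, y0, x1, y1 = boxes.get(inst, (x, y, x, y))
            boxes[inst] = (min(x0, x), min(y0, y), max(x1, x), max(y1, y))
    return boxes
```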


## Multi-object Tracking (MOT) using ByteTrack and MOTChallenge evaluation

The following Colab notebook downloads CityLifeSim into your Drive, applies a state-of-the-art multi-object tracker, and evaluates it. We leverage the work of Zhang et al. (2021). For more details, please refer to the paper or dive into the code.
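For the evaluation step, tracker output has to be serialized in the MOTChallenge result format, one line per box: `frame, id, bb_left, bb_top, bb_width, bb_height, conf, -1, -1, -1`. A minimal sketch (the input tuple layout is our assumption for this example):

```python
def to_mot_lines(tracks):
    """Serialize tracks to MOTChallenge result lines, sorted by frame.

    Each track is assumed to be a tuple:
    (frame, track_id, left, top, width, height, confidence).
    The trailing -1,-1,-1 fields are unused world coordinates.
    """
    lines = []
    for frame, tid, left, top, w, h, conf in sorted(tracks):
        lines.append(
            f"{frame},{tid},{left:.2f},{top:.2f},{w:.2f},{h:.2f},{conf:.2f},-1,-1,-1"
        )
    return lines
```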

## Contributors

Cheng Yao Wang, Eyal Ofek, Daniel McDuff, Oron Nir, Sai Vemprala, Ashish Kapoor, Mar Gonzalez-Franco

Microsoft Research