Commit 1a7537a6 authored by Hannes Diedrich's avatar Hannes Diedrich

Merge branch 'feature/merge_tifs' into 'master'

Feature/merge tifs

See merge request !6
parents ddd78468 38433daa
Pipeline #2603 canceled with stages
in 5 minutes and 46 seconds
@@ -7,6 +7,9 @@ Status
[Coverage report](http://gts2.gitext.gfz-potsdam.de/gts2_client/coverage/)
## Release notes
* **2018-02-16:** Added installation information and installation script for miniconda and all needed packages
* **2018-01-18:** Added Dockerfile and script for building a compatible image and running a container for the client
* **2018-01-17:** Added option for mosaicing/merging tifs and RGBs on client side
* **2018-01-10:** Added output of RGB images into jpg or png, nc-output still in progress
## Description
@@ -16,8 +19,8 @@ time of interest and wanted band combination and saves them to geotiff (.tiff)
or alternatively as .json file or as netcdf (.nc) file.
## Requirements
1. Access to GTS2 API (username, password, port)
1. Python 3
1. Python packages:
* numpy
* gdal
@@ -33,21 +36,55 @@ Clone the repository with:
</code>
## Installation
### ... is easy (as long as your Python is compatible):
* Make sure your system meets all the requirements (see above)
* Install the package by running:
`python gts2_client/setup.py install`
### Install a compatible python:
The following instructions are only valid for Linux distributions but can be adapted for Windows machines.
Everything is based on the bash shell.
#### EITHER: Use install script and follow instructions:
Run: `bash install_py_gts2_client.sh`
After you provide the installation path, the script installs Miniconda
and all necessary packages as well as the gts2_client.
At the end, the script prints instructions on which PATHs to add to your .bashrc.
#### OR: By hand (expert mode):
* Download and install Miniconda (at minimum) with default settings:
```
wget https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh
chmod 755 Miniconda3-latest-Linux-x86_64.sh
./Miniconda3-latest-Linux-x86_64.sh -b
rm -f Miniconda3-latest-Linux-x86_64.sh
```
* Install necessary packages for gts2_client using **gts2_client_conda_install.yml**:
`conda env update -f gts2_client_conda_install.yml`
### Finally:
Create a credentials file in your home directory **credentials_gts2_client**
that contains the following structure:
```bash
{"user": "",
"password": "",
"port": ""}
```
Please fill in your credentials (provided by GFZ Potsdam, please contact gts2@gfz-potsdam.de).
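For reference, here is a minimal sketch of how such a credentials file can be parsed in Python. The `read_credentials` helper and its fallback to port 80 are illustrative, not part of the client API:

```python
import json

def read_credentials(path):
    # parse a credentials file with the structure shown above
    with open(path) as fh:
        cred = json.load(fh)
    # fall back to port 80 when the optional "port" key is missing
    return (cred["user"], cred["password"]), cred.get("port", 80)
```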
### If you want to use docker
**!! Attention: expert mode !! Root access to the computer is needed.**
You only have to run **docker_gts2_client/build_run_gts2_client_docker.sh**
It creates an image based on CentOS 7, performs all needed installation steps, and starts a container.
You end up with a shell where you can run `gts2_client`.
Building the image can take some time (up to 30 minutes), but once it is done,
**docker_gts2_client/build_run_gts2_client_docker.sh** skips the build and only starts the container.
## Usage
### As command line tool:
@@ -56,7 +93,7 @@ gts2_client.py required_arguments [optional_arguments]
```
The list of arguments including their default values can be called from the command line with:
`gts2_client --help`
#### Arguments:
##### Required:
@@ -108,8 +145,14 @@ The list of arguments including their default values can be called from the comm
* -q RGB_BANDS_SELECTION, --rgb_bands_selection RGB_BANDS_SELECTION
band selection for rgb production, choose from:
realistic, nice_looking, vegetation,
healthy_vegetation_urban, snow, agriculture
(default: realistic)
* -w MERGE_TIFS, --merge_tifs MERGE_TIFS
Merge tifs and RGBs per time step if the area spans
two or more MGRS tiles (True or False). (default: False)
* -x MERGE_TILE, --merge_tile MERGE_TILE
Choose MGRS tile into which the merge of files is
performed (e.g. 33UUV). (default: None)
### As python package
@@ -122,3 +165,6 @@ result = gts2_client.client(out_mode="python", **kwargs)
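A sketch of assembling the keyword arguments for such a `client()` call. Parameter names follow the command-line arguments documented above; the coordinates and dates are placeholders, not a tested request:

```python
# Illustrative keyword set for a client() call; values are placeholders.
kwargs = dict(
    geo_ll=(13.05, 52.35),   # lower-left corner (lon, lat)
    geo_ur=(13.10, 52.40),   # upper-right corner (lon, lat)
    start_date="20170101",
    end_date="20170301",
    bands="B02_B04_B08",
)
# result = gts2_client.client(out_mode="python", **kwargs)
# keep the requested area under the 0.2 deg x 0.2 deg limit noted below
assert kwargs["geo_ur"][0] - kwargs["geo_ll"][0] < 0.2
```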
## Limitations:
* Bands with spatial resolution 60m (Band 1) are not stacked.
* Requests with areas larger than 0.2°x0.2° will probably not be processed;
if you also request a large time range, this threshold can be even smaller.
{
"user": "",
"password": "",
"port": 80
}
#!/usr/bin/env bash
context_dir="./context"
dockerfile="gts2_client.docker"
runner_os="centos"
runner_iname="gts2_client_runner"
runner_tag="${runner_os}:${runner_iname}"
container_name="gts2_client"
cred_file="$HOME/credentials_gts2_client"
out_data_folder="/tmp/gts2_client"
#Check if image exists
if [ $(sudo docker images | grep ${runner_iname} | wc -l) == 0 ]
then
# Copying credentials file into context dir, if exists
if [ -e ${cred_file} ]
then
echo "cp ${cred_file} ./context/"
cp ${cred_file} ./context/
else
echo "read"
read -p "Please enter GTS2 username: " username
read -p "Please enter GTS2 password: " password
read -p "Please enter GTS2 port (default 80): " port
cat > ${cred_file} <<EOL
{
"user": "${username}",
"password": "${password}",
"port": ${port}
}
EOL
fi
# build docker image
sudo docker build -f ${context_dir}/${dockerfile} -m 20G -t ${runner_tag} ${context_dir}
fi
#run container and start a shell where you can run the gts2_client
sudo docker rm -f ${container_name}
mkdir -p ${out_data_folder}
echo "Starting gts2_container, please write files to ${out_data_folder} (setting the -o option of client accordingly)"
sudo docker run -it --name ${container_name} -v ${out_data_folder}:${out_data_folder} ${runner_tag} \
bash -i -c "cd /home/gts2_client; git pull origin master; source ~/anaconda3/bin/activate; python setup.py install;
echo;echo;echo;echo;echo '#######';echo 'This is a shell where you can run the gts2_client :::';echo '#######'; bash"
FROM centos:7
RUN yum update -y && \
yum install -y wget vim bzip2 git
ENV anaconda_dl='Anaconda3-5.0.1-Linux-x86_64.sh'
RUN /bin/bash -i -c "wget https://repo.continuum.io/archive/$anaconda_dl && \
bash ./$anaconda_dl -b && \
rm -f /root/$anaconda_dl"
RUN /bin/bash -i -c "source ~/anaconda3/bin/activate && \
conda install --yes -q -c conda-forge gdal 'icu=58.*' lxml pyqt && \
pip install netCDF4 && \
conda update -q --all"
RUN /bin/bash -i -c "cd /home/ && \
git clone https://gitext.gfz-potsdam.de/gts2/gts2_client.git"
COPY credentials_gts2_client /root/credentials_gts2_client
@@ -10,69 +10,203 @@ import time
import logging
import tempfile
import shutil
import os
import collections
import re
import gdal
import numpy as np
import netCDF4 as nc4
from glob import glob
from os.path import isfile
from os.path import join
from os.path import dirname
from os.path import expanduser
from os import makedirs
from scipy.ndimage.interpolation import zoom
from skimage.exposure import rescale_intensity, adjust_gamma
from scipy.misc import imsave
from subprocess import Popen, PIPE
band_settings = {"realistic": ("B04", "B03", "B02"),
"nice_looking": ("B11", "B08", "B03"),
"vegetation": ("B8A", "B04", "B03"),
"healthy_vegetation_urban": ("B8A", "B11", "B02"),
"water": ("B08", "B12", "B01"),
"snow": ("B08", "B11", "B04"),
"agriculture": ("B11", "B08", "B02")}
warnings.filterwarnings("ignore")
class Gts2Request(dict):
def check_geo(self, ll=(), ur=()):
"""
Check if area of request is not too large.
:param ll: tuple
:param ur: tuple
:return:
"""
thres = 0.4
latrange = ur[1]-ll[1]
lonrange = ur[0]-ll[0]
if (lonrange >= thres) | (latrange >= thres):
raise ValueError("Your requested area is too large: ({lon:7.5f}°x{lat:7.5f}° exceeds {thres}°x{thres}°)"
"".format(lat=latrange, lon=lonrange, thres=thres))
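The size check above can be exercised standalone. This is a simplified mirror of the `check_geo` logic (a predicate instead of a raising method), not the method itself:

```python
def area_ok(ll, ur, thres=0.4):
    # mirrors check_geo: both the lon and lat extent must stay below the threshold
    return (ur[0] - ll[0]) < thres and (ur[1] - ll[1]) < thres

assert area_ok((13.0, 52.0), (13.1, 52.1))      # small request passes
assert not area_ok((13.0, 52.0), (13.5, 52.1))  # 0.5 deg of longitude is rejected
```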
def __init__(self, opts, logger=None):
self.check_geo(ll=opts["ll"], ur=opts["ur"])
address = "https://rz-vm175.gfz-potsdam.de:{port}/AC".format(port=opts["auth"]["port"])
# test parameters such that API string formatting won't fail
assert len(opts["ur"]) == 2
assert len(opts["ll"]) == 2
assert opts["sensor"] in ["S2A", "S2B", "S2all"]
assert opts["level"] in ["L1C", "L2A"]
self.api_fmt = "{address}/{bands}/{date_from}_{date_to}/{ll_lon:.5f}_{ur_lon:.5f}_{ll_lat:.5f}_{ur_lat:.5f}" \
"?minimum_fill={minimum_fill}&sensor={sensor}&level={level}&version={version}&suffix={suffix}" \
"&max_cloudy={max_cloudy}&utm_zone={utm_zone}"
# construct api call
self.api_call = self.api_fmt.format(address=address, bands=opts["bands"], date_from=opts["date_from"],
date_to=opts["date_to"], suffix=opts["suffix"], level=opts["level"],
minimum_fill=opts["minimum_fill"], sensor=opts["sensor"],
version=opts["version"], max_cloudy=opts["max_cloudy"],
ll_lon=opts["ll"][0], ll_lat=opts["ll"][1],
ur_lon=opts["ur"][0], ur_lat=opts["ur"][1], utm_zone=opts["utm_zone"])
logger.info(">>>>>> API-call: %s " % self.api_call)
# get data, update dict from json
result = requests.get(self.api_call, verify=False, auth=opts["auth"]["auth"])
status = {"http_code": result.status_code}
self.update(result.json())
self.update(status)
def __spawn(cmd, cwd):
"""
Runs shell command (cmd) in the working directory (cwd).
:param cmd: string
:param cwd: string
:return:
"""
pp = Popen(cmd, stdout=PIPE, stderr=PIPE, shell=True, cwd=cwd)
stdout, stderr = pp.communicate()
if pp.returncode != 0:
raise ValueError(
"cmd:%s,rc:%s,stderr:%s,stdout:%s" % (cmd, str(pp.returncode), str(stderr), str(stdout)))
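The same `Popen` pattern can be tried standalone, assuming a POSIX shell with `echo` available:

```python
from subprocess import Popen, PIPE

def spawn(cmd, cwd="."):
    # run a shell command, raise on non-zero exit (same pattern as __spawn above)
    pp = Popen(cmd, stdout=PIPE, stderr=PIPE, shell=True, cwd=cwd)
    stdout, stderr = pp.communicate()
    if pp.returncode != 0:
        raise ValueError("cmd:%s rc:%s stderr:%s" % (cmd, pp.returncode, stderr))
    return stdout

out = spawn("echo hello")  # stdout is returned as bytes
```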
def __merge_tifs_same_timestamp(tifs, outpath, target_tile=""):
"""
Transforms all tifs from one timestep to the same UTM zone (taken from the target tile) and merges them together.
:param outpath: string
:param tifs: List of strings or None
:param target_tile: 5-element string
:return:
"""
basepath = os.path.dirname(tifs[0])
utm = target_tile[0:2]
base_split = os.path.basename(tifs[0]).split("_")
merge_file = "{path}/{pref}_{tile}_{suff}".format(path=outpath, pref="_".join(base_split[0:4]), tile=target_tile,
suff="_".join(base_split[5::]))
with tempfile.TemporaryDirectory(dir=basepath) as tmp_dir:
for fi in tifs:
if target_tile not in fi:
outfile = "{tmp_dir}/{orig_fn}_{tile}.tif".format(orig_fn=os.path.basename(fi.split(".")[0]),
tile=target_tile, tmp_dir=tmp_dir)
cmd = "gdalwarp -t_srs '+proj=utm +zone={utm} +datum=WGS84' -overwrite {infile} {outfile} ".format(
infile=fi, outfile=outfile, utm=utm)
__spawn(cmd, tmp_dir)
else:
shutil.copy(fi, tmp_dir)
merge_list = " ".join(glob(os.path.join(tmp_dir, "*.tif")))
cmd = "gdal_merge.py -o {merge_file} {merge_list}".format(merge_file=merge_file, merge_list=merge_list)
__spawn(cmd, tmp_dir)
return merge_file
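The merged filename is built by splitting the basename on underscores and swapping in the target tile. A worked example on a hypothetical filename following the `sensor_level_prefix_date_tile_band_resolution` pattern assumed here:

```python
# Hypothetical input filename; only the underscore positions matter.
name = "S2A_L2A_myarea_20170105_32UQD_B04_10m.tif"
parts = name.split("_")
# keep everything before the tile, replace the tile, keep everything after it
merged = "{pref}_{tile}_{suff}".format(
    pref="_".join(parts[0:4]), tile="33UUV", suff="_".join(parts[5:]))
# merged == "S2A_L2A_myarea_20170105_33UUV_B04_10m.tif"
```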
def __find_double_timestamp(files):
"""
Finds filenames with duplicate time stamps (but different MGRS tiles) in a list of files.
:param files: list of strings
:return: dict
"""
basepath = os.path.dirname(files[0])
prefix = "_".join(os.path.basename(files[0]).split("_")[0:3])
bands = list(set(["_".join(os.path.basename(fa).split("_")[5::]).split(".")[0] for fa in files]))
double_dict = {}
for bi in bands:
band_files = [s for s in files if "_{band}.".format(band=bi) in s]
dates = [os.path.basename(bf).split("_")[3] for bf in band_files]
double_dates = ([item for item, count in collections.Counter(dates).items() if count > 1])
double_files = {}
for di in double_dates:
double_files[di] = glob(os.path.join(basepath, "{pref}*_{date}*{band}.*".format(
band=bi, date=di, pref=prefix)))
double_dict[bi] = double_files
return double_dict
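The duplicate-date detection boils down to `collections.Counter`; a minimal standalone example:

```python
import collections

dates = ["20170105", "20170110", "20170105"]  # one date appears in two tiles
# keep only the dates that occur more than once, as in __find_double_timestamp
double_dates = [d for d, c in collections.Counter(dates).items() if c > 1]
```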
def merge_tiles(tif_list, out_mode="single", target_tile=None):
"""
Performs a merge of tifs of the same timestep that are divided into two or more different MGRS tiles.
:param tif_list: list of strings
:param out_mode: string
:param target_tile: 5-element string or None
:return:
"""
stack = True if out_mode == "stack" else False
double_dict = __find_double_timestamp(tif_list)
for bi in double_dict.keys():
for dates, tifs_one_date in double_dict[bi].items():
basepath = os.path.dirname(tifs_one_date[0])
if len(tifs_one_date) > 1:
if target_tile is None:
tiles = [re.search('[0-9]{2,2}[A-Z]{3,3}', ti).group(0).replace("_", "") for ti in tifs_one_date]
target_tile = list(set(tiles))[0]
if stack is False:
with tempfile.TemporaryDirectory(dir=basepath) as parent_dir:
_ = [shutil.move(fm, parent_dir) for fm in tifs_one_date]
tifs_all = glob(os.path.join(parent_dir, "*"))
if len(tifs_all) == 0:
raise FileNotFoundError("No files found in {dir}".format(dir=parent_dir))
bands = list(set(["_".join(os.path.basename(fa).split("_")[6::]).strip(".tif") for fa in tifs_all]))
for bi in bands:
tifs = glob(os.path.join(parent_dir, "S2*{band}*.tif".format(band=bi)))
if len(tifs) != 0:
__merge_tifs_same_timestamp(tifs, basepath, target_tile=target_tile)
else:
with tempfile.TemporaryDirectory(dir=basepath) as parent_dir:
_ = [shutil.move(fm, parent_dir) for fm in tifs_one_date]
tifs_all = glob(os.path.join(parent_dir, "*"))
if len(tifs_all) == 0:
raise FileNotFoundError("No files found in {dir}".format(dir=parent_dir))
__merge_tifs_same_timestamp(tifs_all, basepath, target_tile=target_tile)
else:
raise ValueError("List of tifs to merge is too short")
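MGRS tile IDs are extracted above with a two-digit/three-letter regex; a quick check on a hypothetical filename:

```python
import re

# MGRS tile IDs are two digits followed by three uppercase letters, e.g. 33UUV;
# the date contains no uppercase letters, so the first match is the tile.
fn = "S2A_L2A_myarea_20170105_33UUV_B04_10m.tif"
tile = re.search(r'[0-9]{2}[A-Z]{3}', fn).group(0)
```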
def mean_rgb_comb(infiles, rgb_comb=("B04", "B03", "B02"), bad_data_value=np.NaN):
@@ -153,7 +287,7 @@ def get_lim(limfn, rgb_comb=("B04", "B03", "B02"), bad_data_value=np.NaN):
def mk_rgb(basedir, outdir, rgb_comb=("B04", "B03", "B02"), rgb_gamma=(1.0, 1.0, 1.0), extension="jpg",
resample_order=1, rgb_type=np.uint8, bad_data_value=np.NaN):
resample_order=1, rgb_type=np.uint8, bad_data_value=np.NaN, logger=None):
"""
Generation of RGB images from Sentinel-2 data. Use hist_stretch='single' to process a single image.
Use hist_stretch='series' to process a sequence of images of the same tile but different dates.
@@ -169,10 +303,8 @@ def mk_rgb(basedir, outdir, rgb_comb=("B04", "B03", "B02"), rgb_gamma=(1.0, 1.0,
:return: output directory and filename
"""
fnout_list = []
infiles = glob(os.path.join(basedir, "*.tif"))
if len(infiles) == 0:
raise FileNotFoundError("Could not find any tif files in {dir}".format(dir=basedir))
@@ -193,7 +325,6 @@ def mk_rgb(basedir, outdir, rgb_comb=("B04", "B03", "B02"), rgb_gamma=(1.0, 1.0,
fn_out = join(outdir, "{pref}_RGB_{bands}.{ext}".format(pref="_".join(split), bands="_".join(rgb_comb),
ext=extension))
if isfile(fn_out) is False:
# Create output array
ds = gdal.Open(fin, gdal.GA_ReadOnly)
sdata = ds.GetRasterBand(1)
@@ -202,18 +333,13 @@ def mk_rgb(basedir, outdir, rgb_comb=("B04", "B03", "B02"), rgb_gamma=(1.0, 1.0,
S2_rgb = np.zeros(output_shape + [len(rgb_comb), ], dtype=rgb_type)
if ds is None:
raise ValueError("Can not load {fn}".format(fn=fin))
#
for i_rgb, (band, gamma) in enumerate(zip(rgb_comb, rgb_gamma)):
data = ds.GetRasterBand(i_rgb + 1).ReadAsArray() / 10000.0
# Determine stretching limits
lim = limits[rgb_comb.index(band)]
# Calculate and apply zoom factor to input image
zoom_factor = np.array(output_shape) / np.array(data[:, :].shape)
@@ -224,17 +350,14 @@ def mk_rgb(basedir, outdir, rgb_comb=("B04", "B03", "B02"), rgb_gamma=(1.0, 1.0,
# Rescale intensity to range (0.0, 255.0)
bf = rescale_intensity(image=zm, in_range=lim, out_range=(0.0, 255.0))
# Create output image of rgb type and apply gamma correction
if gamma != 0.0:
    S2_rgb[:, :, i_rgb] = np.array(adjust_gamma(bf, gamma), dtype=rgb_type)
else:
    S2_rgb[:, :, i_rgb] = np.array(bf, dtype=rgb_type)
del data
del ds
# Save output rgb image
makedirs(dirname(fn_out), exist_ok=True)
imsave(fn_out, S2_rgb)
fnout_list.append(fn_out)
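The per-band contrast stretch performed above via `rescale_intensity` can be sketched with plain numpy: values inside the stretch limits map linearly onto 0-255 and outliers are clipped. This is a simplified stand-in, not the skimage implementation:

```python
import numpy as np

def stretch(band, lo, hi):
    # linear stretch of [lo, hi] onto [0, 255], clipping values outside the limits
    out = (np.clip(band, lo, hi) - lo) / (hi - lo) * 255.0
    return out.astype(np.uint8)

band = np.array([[0.0, 0.25], [0.5, 1.0]])  # reflectance-like values
img = stretch(band, 0.0, 0.5)
```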
@@ -245,19 +368,21 @@ def json_to_tiff(out_mode, api_result, only_tile, outpath, out_prefix, wl, level
"""
Get data from the json dict and save it as single tif files OR:
save all requested bands plus cloud mask as one tiff file per date and tile.
:type out_mode: string
:type api_result: json string
:type only_tile: string
:type outpath: string
:type out_prefix: string
:type wl: dict
:type level: string
:type stack_resolution: string
:type bands: string
:type logger: logger
:return: List of written tifs (list of strings)
"""
tif_list = []
if out_mode == "single":
for tile_key in api_result['Results'].keys():
if not ((only_tile != "") & (tile_key != only_tile)):
@@ -301,6 +426,7 @@ def json_to_tiff(out_mode, api_result, only_tile, outpath, out_prefix, wl, level
img.FlushCache()
del img
tif_list.append(outfile)
# create tif file for clm
if level != "L1C":
@@ -322,6 +448,7 @@ def json_to_tiff(out_mode, api_result, only_tile, outpath, out_prefix, wl, level
"2016/06 and 2017/12.")
img.FlushCache()
del img
tif_list.append(clm_outfile)
if out_mode == "stack":
for tile_key in api_result['Results'].keys():
@@ -377,11 +504,11 @@ def json_to_tiff(out_mode, api_result, only_tile, outpath, out_prefix, wl, level
geotrans = (x_min, int(stack_resolution), 0, y_max, 0, -int(stack_resolution))
driver = gdal.GetDriverByName("GTiff")
if level == "L1C":
outfile = "{path}/{sensor}_{level}_{pref}_{date}_{tile}_stack_{band}_{res}m.tif".format(
outfile = "{path}/{sensor}_{level}_{pref}_{date}_{tile}_{band}_{res}m.tif".format(
path=outpath, pref=out_prefix, date=ac_date, tile=tile_key, band=bands, level=level,
res=stack_resolution, sensor=sensor)
else:
outfile = "{path}/{sensor}_{level}_{pref}_{date}_{tile}_stack_{band}_MSK_{res}m.tif".format(
outfile = "{path}/{sensor}_{level}_{pref}_{date}_{tile}_{band}_MSK_{res}m.tif".format(
path=outpath, pref=out_prefix, date=ac_date, tile=tile_key, band=bands, level=level,
res=stack_resolution, sensor=sensor)
img = driver.Create(outfile, cols, rows, count_bands + 1, gdal.GDT_Int32)
@@ -414,6 +541,10 @@ def json_to_tiff(out_mode, api_result, only_tile, outpath, out_prefix, wl, level
img.FlushCache()
del img
tif_list.append(outfile)
return tif_list
def json_to_netcdf(out_mode, api_result, outpath, out_prefix, geo_ll, geo_ur, start_date, end_date, bands, level, wl):
"""
@@ -514,34 +645,68 @@ def json_to_netcdf(out_mode, api_result, outpath, out_prefix, geo_ll, geo_ur, st
raise Exception("Something went wrong while saving as netcdf. " + traceback.format_exc())
def __get_auth(logger=None):
"""
Gets auth and port
:param logger: logger
:return: dict
"""
home_dir = expanduser("~")
cred_file = "%s/credentials_gts2_client" % home_dir
if len(glob(cred_file)) == 1:
with open(cred_file, "r") as cf:
cred_dict = json.load(cf)
if "port" in cred_dict.keys():
port = cred_dict["port"]
else:
port = 80
auth = (cred_dict["user"], cred_dict["password"])
else:
logger.error("You did not save your credentials in %s." % cred_file)
user = input("gts2_client - Please enter username: ")
password = input("gts2_client - Please enter password: ")
port = input("gts2_client - Please enter port for API (for default, type 'yes'): ")
if port == "yes":
port = "80"
auth = (user, password)
return {"auth": auth, "port": port}
def client(outpath="", out_prefix="", out_mode="json", geo_ll=(), geo_ur=(), sensor="S2A", bands="", max_cloudy="0.5",
level="L2A", start_date="", end_date="", version="0.12", suffix="", minimum_fill="",
only_tile="", stack_resolution="10", quiet=False, rgb_extension="jpg", rgb_bands_selection="realistic",
merge_tifs=False, merge_tile=None):
"""