Install RDKit - Computer

1. Install Boost with Python 3 (Boost.Python) support
ref: https://github.com/pupil-labs/pupil/issues/874, huangjiancong1

# extract and enter the Boost 1.65.1 source tree
tar -xzvf boost_1_65_1.tar.gz
cd boost_1_65_1

# point Boost.Build at the system gcc and Python 3.6 (this overwrites ~/user-config.jam)
echo "using mpi ;
using gcc : : g++ ;
using python : 3.6 : /usr/bin/python3 : /usr/include/python3.6m : /usr/local/lib ;" > ~/user-config.jam

# bootstrap against Python 3 and install to /usr/local
./bootstrap.sh --with-python=/usr/bin/python3 --with-python-version=3.6 --with-python-root=/usr/local/lib/python3.6 --prefix=/usr/local
sudo ./b2 install -a --with=all
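
A quick sanity check (assuming the default --prefix=/usr/local used above) is to confirm that the Boost.Python 3 library actually landed in /usr/local/lib:

# should list libboost_python3.so.1.65.1 and friends
ls /usr/local/lib/libboost_python3*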

2. Install RDKit
In RDKit's CMakeLists.txt, change the required Boost version (1.5.1) to the version of Boost installed above.
Change all of the paths below to match your system.
CMake needs to be version 3.1 or newer.

# run from an rdkit build directory; note the trailing ".."
cmake -DPYTHON_LIBRARY=/usr/lib/python3.6/config/libpython3.6.a \
-DPYTHON_INCLUDE_DIR=/usr/include/python3.6/ \
-DPYTHON_EXECUTABLE=/usr/bin/python3 \
-DBOOST_LIBRARIES=libboost_python3.so.1.65.1 \
-DBoost_INCLUDE_DIR=include_folder ..
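
After configuring, the usual CMake build and install steps would follow (a sketch; adjust the -j4 parallelism to your machine):

make -j4
sudo make install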

3. Add the RDKit Python package path to PYTHONPATH and the RDKit library path to LD_LIBRARY_PATH (see the sketch below).
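
For example, a minimal sketch assuming the RDKit source/build tree sits at /path/to/rdkit (a placeholder path; RDBASE is just a convenience variable here):

export RDBASE=/path/to/rdkit                           # placeholder; point at your RDKit checkout
export PYTHONPATH=$RDBASE:$PYTHONPATH                  # makes the rdkit Python package importable
export LD_LIBRARY_PATH=$RDBASE/lib:$LD_LIBRARY_PATH    # makes the compiled RDKit libraries discoverable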

written time : 2020-09-15 20:18:59.0

Turning off TF2 auto-sharding warning - Computer

https://github.com/tensorflow/tensorflow/issues/42146#issuecomment-671484239

Message: "Consider either turning off auto-sharding or switching the auto_shard_policy to DATA to shard this dataset."
If your TensorFlow scripts emit this log message, tf.data has already fallen back to DATA sharding. So, to get rid of the message, set auto_shard_policy to DATA explicitly using tf.data.Options(), as follows:

import tensorflow as tf

options = tf.data.Options()
# shard by slicing the data rather than by files, matching the fallback behaviour
options.experimental_distribute.auto_shard_policy = tf.data.experimental.AutoShardPolicy.DATA
dataset = dataset.with_options(options)

written time : 2020-09-14 19:15:49.0

Turning on TensorFlow XLA - Computer

Version: f1f85733439920c031519042e312d6bbe60f5f93

# site-packages location where TensorFlow is installed
TF_PATH=`pip show tensorflow | grep Location | cut -f2 -d" "`
# expose the XLA devices and force JIT compilation globally (including on CPU)
export TF_XLA_FLAGS="--tf_xla_enable_xla_devices=true --tf_xla_auto_jit=2 --tf_xla_cpu_global_jit"
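
To check that the flags took effect, one option (assuming a TF 2.x build) is to list the logical devices from the same shell; XLA_CPU (and XLA_GPU, if present) entries should appear:

python3 -c "import tensorflow as tf; print(tf.config.list_logical_devices())"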

written time : 2020-09-13 14:46:16.0