2024-02-06
13:39 <brouberol> add new TLS SANs to the superset/superset-next certificates in dse-k8s-eqiad - T356481 [analytics]
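For context on the entry above: a minimal sketch of what adding TLS SANs to a Kubernetes-managed certificate can look like, assuming the superset/superset-next certificates are cert-manager Certificate objects reachable with the kubernetes Python client; the resource names and hostnames below are illustrative assumptions, not taken from T356481.

    # Hedged illustration only: assumes cert-manager Certificate resources;
    # namespace, resource name and hostnames are hypothetical.
    from kubernetes import client, config

    config.load_kube_config()  # or load_incluster_config() inside the cluster
    api = client.CustomObjectsApi()

    # Merge-patch the Certificate spec with an extended dnsNames list;
    # cert-manager then reissues the certificate with the new SANs.
    api.patch_namespaced_custom_object(
        group="cert-manager.io",
        version="v1",
        namespace="superset",
        plural="certificates",
        name="superset-tls",                  # hypothetical resource name
        body={"spec": {"dnsNames": [
            "superset.svc.eqiad.wmnet",       # hypothetical existing SAN
            "superset-next.svc.eqiad.wmnet",  # hypothetical new SAN
        ]}},
    )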
13:29 <stevemunene> roll restart hadoop masters to pick up the right rack assignment for new hosts T353776 [analytics]
11:45 <stevemunene> add new an-workers to analytics_cluster hadoop worker role analytics_cluster::hadoop::worker T353776 [analytics]
11:03 <btullis> reimaging an-web1001 to bullseye for T349398 [analytics]
2024-02-05
14:07 <btullis> deploying conda-analytics version 0.0.28 to hadoop-all for T345482 [analytics]
13:50 <brouberol> increasing pod & container limits in the dse-k8s-eqiad superset/superset-next namespaces - T352166 [analytics]
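In these clusters namespace limits are normally raised declaratively through deployment configuration, but as a rough illustration of the effect described in T352166, here is a sketch that bumps a namespace ResourceQuota with the kubernetes Python client; the quota name, namespace and numbers are assumptions, not values from the task.

    # Hedged illustration: raises the hard limits on an existing namespace
    # ResourceQuota. Quota name, namespace and numbers are hypothetical.
    from kubernetes import client, config

    config.load_kube_config()
    core = client.CoreV1Api()

    core.patch_namespaced_resource_quota(
        name="quota",                 # hypothetical ResourceQuota name
        namespace="superset-next",
        body={"spec": {"hard": {
            "pods": "20",
            "limits.cpu": "16",
            "limits.memory": "32Gi",
        }}},
    )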
12:37 <btullis> roll-restarting druid-analytics workers for T356382 [analytics]
12:35 <btullis> deploying conda-analytics version 0.0.28 to hadoop-test [analytics]
2024-02-02
10:27 <btullis> correction: reimaging an-airflow1002 to bullseye for T335261 [analytics]
10:27 <btullis> reimaging an-airflow1004 to bullseye for T335261 [analytics]
09:46 <btullis> reimaging an-airflow1004 to bullseye for T335261 [analytics]
2024-02-01
13:40 <btullis> roll-restarting zookeeper on druid-analytics for T356382 [analytics]
13:34 <btullis> roll-restarting zookeeper on druid-public for T356382 [analytics]
13:25 <btullis> roll-restarting zookeeper on an-conf* for T356382 [analytics]
12:35 <joal> Rerun refinery-sqoop-whole-mediawiki after hotfix [analytics]
12:30 <joal> hotfix HDFS sqoop list to prevent an entire redeploy [analytics]
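The hotfix above avoids a full refinery deploy by editing the sqoop wiki list directly on HDFS before rerunning the sqoop service; a minimal sketch of that pattern follows, with the HDFS path and file name being hypothetical placeholders rather than the actual refinery locations.

    # Hedged sketch of a direct-on-HDFS hotfix: fetch the sqoop wiki list,
    # drop one wiki, and push the edited copy back, skipping a full redeploy.
    # The HDFS path below is hypothetical, not taken from the log.
    import subprocess

    HDFS_LIST = "/wmf/refinery/current/static_data/mediawiki/grouped_wikis/grouped_wikis.csv"
    LOCAL = "grouped_wikis.csv"

    subprocess.run(["hdfs", "dfs", "-get", HDFS_LIST, LOCAL], check=True)

    with open(LOCAL) as f:
        kept = [line for line in f if "trvwikisource" not in line]

    with open(LOCAL, "w") as f:
        f.writelines(kept)

    # -f overwrites the existing file on HDFS in place
    subprocess.run(["hdfs", "dfs", "-put", "-f", LOCAL, HDFS_LIST], check=True)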
12:08 <joal> Restart refinery-sqoop-whole-mediawiki.service after deploy [analytics]
11:29 <phuedx> Deployed refinery onto hdfs [analytics]
11:21 <btullis> deploying the new spark-operator images based on JRE 8 for T354273 [analytics]
10:55 <phuedx> phuedx@deploy2002 Finished deploy [analytics/refinery@0d8e976] (hadoop-test): Remove trvwikisource from sqoop list (duration: 03m 30s) [analytics]
10:51 <phuedx> phuedx@deploy2002 Started deploy [analytics/refinery@0d8e976] (hadoop-test): Remove trvwikisource from sqoop list [analytics]
10:51 <phuedx> phuedx@deploy2002 Finished deploy [analytics/refinery@0d8e976] (thin): Remove trvwikisource from sqoop list (duration: 00m 05s) [analytics]
10:51 <phuedx> phuedx@deploy2002 Started deploy [analytics/refinery@0d8e976] (thin): Remove trvwikisource from sqoop list [analytics]
10:50 <phuedx> phuedx@deploy2002 Finished deploy [analytics/refinery@0d8e976]: analytics/refinery: Remove trvwikisource from sqoop list (duration: 10m 20s) [analytics]
10:39 <phuedx> phuedx@deploy2002 Started deploy [analytics/refinery@0d8e976]: analytics/refinery: Remove trvwikisource from sqoop list [analytics]
10:08 <btullis> deploying Superset 3.1.0 to an-tool1010 with https://gerrit.wikimedia.org/r/c/analytics/superset/deploy/+/994213 [analytics]
09:49 <joal> deploying airflow for interlanguage_navigation in Iceberg [analytics]
2024-01-31
19:59 <joal> Deploying refinery with scap for second hotfix [analytics]
19:14 <joal> Backfill wmf_traffic.aqs_hourly [analytics]
19:14 <joal> Drop/Recreate wmf_traffic.aqs_hourly table (iceberg) to change compression format [analytics]
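A minimal PySpark sketch of the drop/recreate-to-change-compression step above, assuming a Spark session with the Iceberg catalog and extensions configured; the column list and codec are illustrative assumptions, not the actual aqs_hourly schema.

    # Hedged sketch: recreate an Iceberg table with a different Parquet
    # compression codec via TBLPROPERTIES. Schema and codec are assumptions.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("recreate-aqs-hourly").getOrCreate()

    spark.sql("DROP TABLE IF EXISTS wmf_traffic.aqs_hourly")

    spark.sql("""
        CREATE TABLE wmf_traffic.aqs_hourly (
            dt             TIMESTAMP,
            cache_status   STRING,
            http_status    STRING,
            response_count BIGINT
        )
        USING iceberg
        PARTITIONED BY (days(dt))
        TBLPROPERTIES ('write.parquet.compression-codec' = 'zstd')
    """)

    # The backfill logged above would then repopulate the table,
    # e.g. with an INSERT INTO ... SELECT from the source data.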
18:40 <phuedx> phuedx@deploy2002 Finished deploy [airflow-dags/analytics@5078a6b]: (no justification provided) (duration: 00m 28s) [analytics]
18:40 <phuedx> phuedx@deploy2002 Started deploy [airflow-dags/analytics@5078a6b]: (no justification provided) [analytics]
17:46 <phuedx> Deployed refinery using scap, then deployed onto hdfs [analytics]
17:40 <joal> pause pageview_actor_hourly for deploy [analytics]
17:35 <phuedx> phuedx@deploy2002 Finished deploy [analytics/refinery@bef134c] (hadoop-test): Regular analytics weekly train TEST [analytics/refinery@bef134c2] (duration: 03m 29s) [analytics]
17:31 <phuedx> phuedx@deploy2002 Started deploy [analytics/refinery@bef134c] (hadoop-test): Regular analytics weekly train TEST [analytics/refinery@bef134c2] [analytics]
17:31 <phuedx> phuedx@deploy2002 Finished deploy [analytics/refinery@bef134c] (thin): Regular analytics weekly train THIN [analytics/refinery@bef134c2] (duration: 00m 08s) [analytics]
17:31 <phuedx> phuedx@deploy2002 Started deploy [analytics/refinery@bef134c] (thin): Regular analytics weekly train THIN [analytics/refinery@bef134c2] [analytics]
17:30 <phuedx> phuedx@deploy2002 Finished deploy [analytics/refinery@bef134c]: Regular analytics weekly train [analytics/refinery@bef134c2] (duration: 11m 05s) [analytics]
17:19 <phuedx> phuedx@deploy2002 Started deploy [analytics/refinery@bef134c]: Regular analytics weekly train [analytics/refinery@bef134c2] [analytics]
17:02 <phuedx> phuedx@deploy2002 Finished deploy [analytics/refinery@2c00cad] (hadoop-test): Regular analytics weekly train TEST [analytics/refinery@2c00cad1] (duration: 03m 35s) [analytics]
17:00 <phuedx> phuedx@deploy2002 Started deploy [analytics/refinery@2c00cad] (hadoop-test): Regular analytics weekly train TEST [analytics/refinery@2c00cad1] [analytics]
16:57 <phuedx> phuedx@deploy2002 Finished deploy [analytics/refinery@2c00cad] (thin): Regular analytics weekly train THIN [analytics/refinery@2c00cad1] (duration: 00m 06s) [analytics]
16:57 <phuedx> phuedx@deploy2002 Started deploy [analytics/refinery@2c00cad] (thin): Regular analytics weekly train THIN [analytics/refinery@2c00cad1] [analytics]
16:53 <phuedx> phuedx@deploy2002 Finished deploy [analytics/refinery@2c00cad]: Regular analytics weekly train [analytics/refinery@2c00cad1] (duration: 09m 52s) [analytics]
16:52 <phuedx> Regular analytics weekly train [analytics/refinery@$(git rev-parse --short HEAD)] [analytics]
12:12 <btullis> rebooting dbstore1009 for new kernel version (T356239) [analytics]
11:56 <btullis> rebooting dbstore1008 for new kernel version (T356239) [analytics]
10:57 <btullis> deploying https://gerrit.wikimedia.org/r/c/analytics/superset/deploy/+/994213 to superset-next to test nested display of presto columns [analytics]
2024-01-30
18:48 <xcollazo> ran the following commands to create a production test dump folder: [analytics]