2023-12-01
10:04 <btullis> marked TaskInstance: pageview_hourly.move_data_to_archive scheduled__2023-12-01T06:00:00+00:00 as succeeded in airflow analytics [analytics]
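The entry above marks a stuck task instance as succeeded rather than re-running it. That is normally done through the Airflow web UI; a minimal CLI sketch of the same operation, assuming Airflow 2.x where the third positional argument of `airflow tasks run` accepts a run id:

    # Mark the task instance as succeeded without actually executing it
    airflow tasks run pageview_hourly move_data_to_archive \
        scheduled__2023-12-01T06:00:00+00:00 --mark-success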
2023-11-30
17:41 <btullis> reran refine_event for mediawiki_cirrussearch_request [analytics]
08:28 <stevemunene> reimage druid1010 to pick up the right raid config and corresponding partman recipe T336043 [analytics]
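The druid1010 reimage above is the usual cookbook-driven reimage, run after the RAID/partman change is merged in puppet. A hedged sketch, assuming the standard WMF reimage cookbook and that the target release is bullseye (the log entry does not state the OS):

    # Run from a cluster-management (cumin) host; flags may differ by cookbook version
    sudo cookbook sre.hosts.reimage --os bullseye -t T336043 druid1010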
2023-11-29
17:10 <btullis> depool schema2004 for reimage to bookworm for T349286 [analytics]
17:07 <btullis> pooled schema2003 after reimage to bookworm [analytics]
15:30 <btullis> depool schema2003 for upgrade to bookworm [analytics]
15:24 <btullis> pooled schema1004 after upgrade to bookworm for T349286 [analytics]
14:44 <btullis> reimaging schema1004 to bookworm for T349286 [analytics]
14:43 <btullis> depooling schema1004 for reimage T349286 [analytics]
14:41 <btullis> pooled schema1003 after upgrade to bookworm [analytics]
14:10 <btullis> reimaging schema1003 to bookworm for T349286 [analytics]
14:04 <btullis> depooling schema1003 for reimage T349286 [analytics]
14:01 <btullis> increased the size of the vg0/srv logical volume on an-web1001 by 350 GB for T349889 [analytics]
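The schema100x/200x entries above all follow the same depool → reimage → repool cycle, and the last entry grows the /srv volume on an-web1001. A sketch of the underlying commands, assuming conftool is used for pooling and standard LVM tooling for the resize:

    # Take the host out of rotation before reimaging, put it back afterwards
    # (conftool object names are assumptions for illustration)
    sudo confctl select 'name=schema1003.eqiad.wmnet' set/pooled=no
    sudo confctl select 'name=schema1003.eqiad.wmnet' set/pooled=yes

    # Grow the /srv logical volume by 350 GB and resize the filesystem in one step
    sudo lvextend -r -L +350G /dev/vg0/srv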
2023-11-28
18:30 <milimetric> deployed refinery to hdfs [analytics]
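Refinery deployments like the one above go out with scap and are then synced to HDFS. A rough sketch, assuming the usual deployment-host layout and that the refinery repo still ships a refinery-deploy-to-hdfs helper (script name and flags are assumptions):

    # On the deployment host, from the refinery checkout
    cd /srv/deployment/analytics/refinery && scap deploy "Regular analytics deploy"

    # Then push the artifacts to HDFS from an analytics client
    sudo -u analytics-deploy /srv/deployment/analytics/refinery/bin/refinery-deploy-to-hdfs --verbose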
2023-11-27
21:03 <btullis> deploying airflow-dags to analytics_test instance [analytics]
15:05 <stevemunene> pool druid1007 after bullseye reimage T332589 [analytics]
13:27 <stevemunene> reimage druid1007 to upgrade to bullseye T332589 [analytics]
2023-11-24
12:34 <joal> Rerun webrequest refine text for 2023-11-23T17 [analytics]
06:07 <stevemunene> pool druid1008 after reimage T332589 [analytics]
2023-11-23
14:58 <btullis> merging 974649: Remove all remaining references to oozie and clean up | https://gerrit.wikimedia.org/r/c/operations/puppet/+/974649 for T341893 [analytics]
14:12 <btullis> roll-restarting hadoop masters on test cluster for T341893 [analytics]
12:44 <btullis> removing oozie configuration from core hadoop files with https://gerrit.wikimedia.org/r/c/operations/puppet/+/974647 for T341893 [analytics]
11:05 <gehel> testing SAL and logging [analytics]
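The 14:12 entry roll-restarts the Hadoop master daemons on the test cluster so they pick up the core config with the oozie settings removed. A sketch, assuming the WMF spicerack cookbook for this exists under roughly this name and takes the cluster as its argument:

    # Restart NameNodes/ResourceManagers one at a time, waiting for failover in between
    sudo cookbook sre.hadoop.roll-restart-masters test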
2023-11-22
16:27 <joal> Kill duplicated XMLDumpsConverter [analytics]
15:39 <btullis> updating default airflow configuration with https://gerrit.wikimedia.org/r/c/operations/puppet/+/976700 [analytics]
12:22 <btullis> applying security patches to postgres13 on an-db1001 [analytics]
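The 16:27 entry kills a duplicated XMLDumpsConverter application running on YARN; a minimal sketch using the standard YARN CLI (the application id below is a placeholder):

    # Find the duplicated application and kill it by id
    yarn application -list -appStates RUNNING | grep XMLDumpsConverter
    yarn application -kill application_1700000000000_12345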
2023-11-21
15:04 <stevemunene> pool druid1011 after reimage T336043 [analytics]
2023-11-20
16:43 <mforns> reran Airflow's refine_webrequest_hourly_text::refine_webrequest with excluded_row_ids for 2023-11-19T21 [analytics]
2023-11-19
08:15 <mforns> reran Airflow's refine_webrequest_hourly_text::refine_webrequest with excluded_row_ids for 2023-11-19T00 [analytics]
2023-11-18
21:57 <mforns> reran Airflow's refine_webrequest_hourly_text::refine_webrequest with excluded_row_ids for 2023-11-18T12 [analytics]
19:47 <mforns> reran Airflow's refine_webrequest_hourly_text::refine_webrequest with excluded_row_ids for 2023-11-17T22 [analytics]
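The refine_webrequest reruns on 2023-11-18/19/20 all follow the same pattern: clear the failed task instance for the affected hour and let the scheduler re-run it with the problematic rows excluded. A hedged sketch with the Airflow 2.x CLI (how excluded_row_ids is injected depends on the DAG and is not shown here):

    # Clear the failed hour so the scheduler re-runs the refine task
    airflow tasks clear refine_webrequest_hourly_text -t refine_webrequest \
        -s 2023-11-18T12:00:00 -e 2023-11-18T13:00:00 --yes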
2023-11-17
14:58 <mforns> marked several failed tasks of datahub_ingestion DAG in Airflow, because the issues were fixed, added notes to the DAG itself [analytics]
12:55 <joal> Rerun Airflow metadata_ingest_daily datahub job [analytics]
2023-11-16
14:45 <btullis> rolling out 974993: Add spark.sql.warehouse.dir to spark3 defaults | https://gerrit.wikimedia.org/r/c/operations/puppet/+/974993 for T349523 [analytics]
13:22 <sergi0> stat1008: Add `sowiki`, `stwiki`, `tgwiki` and `ugwiki` to `/srv/published/datasets/one-off/research-mwaddlink/wikis.txt` (T340944) [analytics]
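The 14:45 entry rolls out spark.sql.warehouse.dir to the Spark 3 defaults via puppet; the effective change is a one-line addition to spark-defaults.conf along these lines (the path and value are illustrative assumptions, not taken from the patch):

    # /etc/spark3/conf/spark-defaults.conf
    spark.sql.warehouse.dir  hdfs:///user/hive/warehouse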
2023-11-15
20:44 <xcollazo> Ran 'sudo -u analytics hdfs dfs -rm -r -skipTrash /user/hive/warehouse/wmf_dumps.db/wikitext_raw_rc1' to delete HDFS data of old release candidate table [analytics]
20:43 <xcollazo> Ran 'sudo -u analytics hdfs dfs -rm -r -skipTrash /wmf/data/wmf_dumps/wikitext_raw_rc0' to delete HDFS data of old release candidate table [analytics]
20:42 <xcollazo> Ran 'DROP TABLE wmf_dumps.wikitext_raw_rc0' and 'DROP TABLE wmf_dumps.wikitext_raw_rc1' to delete older release candidate tables. [analytics]
14:51 <ottomata> deployed refine using refinery-job 0.2.26 JsonSchemaConverter from wikimedia-event-utilities - https://phabricator.wikimedia.org/T321854 [analytics]
14:33 <joal> Deploy refinery onto HDFS (unique-devices hotfix) [analytics]
13:44 <joal> Deploying refinery for unique-devices hotfix [analytics]
11:22 <btullis> exiting safe mode [analytics]
11:06 <btullis> merged all config files changes replacing an-coord1001 with an-mariadb1001 [analytics]
11:04 <btullis> position confirmed, resetting all slaves on an-mariadb1001 for T284150 [analytics]
11:02 <btullis> set an-coord1001 mysql to read_only [analytics]
11:01 <btullis> entering HDFS safe mode [analytics]
11:01 <btullis> proceeding with the implementation plan here: https://phabricator.wikimedia.org/T284150#9330525 [analytics]
10:43 <btullis> temporarily disabled production jobs that write to HDFS [analytics]
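The 11:01–11:22 entries are the cutover window for moving the analytics metadata databases from an-coord1001 to an-mariadb1001 (T284150): pause writers, freeze HDFS, stop writes on the old primary, confirm the replication position, then detach the new primary. A sketch of the underlying commands, assuming standard HDFS and MariaDB tooling:

    # Freeze HDFS writes for the cutover, release afterwards (11:01 / 11:22 entries)
    sudo -u hdfs hdfs dfsadmin -safemode enter
    sudo -u hdfs hdfs dfsadmin -safemode leave

    # On an-coord1001: stop accepting writes (11:02 entry)
    sudo mysql -e "SET GLOBAL read_only = 1;"

    # On an-mariadb1001: once the position matches, detach it from the old primary (11:04 entry)
    sudo mysql -e "STOP SLAVE; RESET SLAVE ALL;"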
2023-11-14
20:35 <sfaci> recreated unique_devices iceberg tables [analytics]
20:35 <sfaci> restarted Druid supervisors [analytics]
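Restarting Druid supervisors, as in the 20:35 entry, can be done through the Overlord's supervisor API; a sketch assuming suspend/resume is sufficient and using a placeholder host, port, and supervisor id (all assumptions):

    # Suspend and resume a streaming ingestion supervisor via the Overlord API
    curl -X POST http://druid-overlord.example:8081/druid/indexer/v1/supervisor/my_supervisor/suspend
    curl -X POST http://druid-overlord.example:8081/druid/indexer/v1/supervisor/my_supervisor/resume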