Posts by Tags

AI

Discrimination in AI and DeepFakes

Published:

Introduction

Artificial intelligence (AI) algorithms are being used in our day-to-day tasks, from search engines and email filtering to content recommendation. Media coverage of AI tends to push us toward one of two scenarios: a positive scenario in which AI automates every task and solves every problem, and a doomsday scenario in which AI takes over humanity [1]. However, this coverage rarely engages in a constructive conversation about the realistic dangers that come with AI and how AI might affect us in the context of society, politics, economics, gender, race, sexual orientation, social class, and so on [1]. One aspect of AI’s impact on our societies is the consolidation of existing power dynamics. Research has shown that AI-based systems are prone to reproducing harmful social biases [34], which in turn consolidates dysfunctional social structures that favour historically advantaged people, such as favouring men over women for STEM-related jobs [2]. Deepfake videos, and the AI systems that make them, are another manifestation of this consolidation of power, with the potential risks they pose, especially to women.

Deepfake applications use off-the-shelf AI algorithms to generate fake content. Algorithms such as generative adversarial networks (GANs), variational autoencoders (VAEs), and long short-term memory (LSTM) networks are used to train deepfake applications to swap the faces of the people in two different videos, or to copy the facial expressions of the person in one video onto the person in the other. The open-source deepfake applications FaceSwap and DeepFaceLab use VAE algorithms [5]. These deepfake applications, like other AI-based systems, don’t actually “learn” anything about the task they are supposed to do; rather, they learn spurious correlations between a group of variables present in the training datasets [6]. AI companies claim that their systems make the right decisions, but with no guarantee that they do so for the right reasons. Hence the term “black box” is used to describe AI-based systems.
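To make the face-swap idea concrete, here is a deliberately tiny sketch of the shared-encoder, per-identity-decoder design that VAE-based tools are commonly described as using. Everything here is a toy assumption: real tools train deep convolutional VAEs on thousands of frames, not the random linear maps used below. The point is only the swap step at the end: encode a frame of person A, then decode it with person B’s decoder.

```python
import numpy as np

rng = np.random.default_rng(0)
latent_dim, face_dim = 8, 64

# One shared encoder learns identity-agnostic structure (pose, expression).
encoder = rng.normal(size=(latent_dim, face_dim)) * 0.1
# One decoder per identity learns to render that structure as a specific face.
decoder_a = rng.normal(size=(face_dim, latent_dim)) * 0.1
decoder_b = rng.normal(size=(face_dim, latent_dim)) * 0.1

def encode(face):
    # Map a (flattened) face image to a low-dimensional latent code.
    return encoder @ face

def decode(decoder, latent):
    # Render a latent code back into face space with an identity's decoder.
    return decoder @ latent

# The "swap": a frame of person A, decoded with B's decoder, yields
# B's identity carrying A's pose and expression.
frame_of_a = rng.normal(size=face_dim)
swapped = decode(decoder_b, encode(frame_of_a))
print(swapped.shape)  # (64,)
```

The shared encoder is what makes the trick work: because both identities are compressed through the same bottleneck, the latent code ends up describing what the faces have in common (geometry, expression) while each decoder supplies the identity-specific appearance.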

Academia

Whistling Vivaldi: How Stereotypes affect us and what we can do about them?

Published:

I received Claude Steele’s book “Whistling Vivaldi: How Stereotypes Affect Us and What We Can Do” as a Christmas gift, and it was the best gift I got this year. The book explains, in a scientific way backed by experiments and publications, how negative stereotypes and stigmatization attached to certain groups influence the intellectual and physical abilities of the people who belong to those groups. The book clarified a lot of what I went through as a female Egyptian Ph.D. student in Computer Science in the UK; I could relate to the feelings, the struggles, and the experiences of the students who participated in the experiments described in the book. I finished it in a week and decided to write this summary to spread the knowledge.

Bias

Discrimination in AI and DeepFakes

Published:

Introduction

Artificial intelligence (AI) algorithms are being used in our day-to-day tasks, from search engines and email filtering to content recommendation. Media coverage of AI tends to push us toward one of two scenarios: a positive scenario in which AI automates every task and solves every problem, and a doomsday scenario in which AI takes over humanity [1]. However, this coverage rarely engages in a constructive conversation about the realistic dangers that come with AI and how AI might affect us in the context of society, politics, economics, gender, race, sexual orientation, social class, and so on [1]. One aspect of AI’s impact on our societies is the consolidation of existing power dynamics. Research has shown that AI-based systems are prone to reproducing harmful social biases [34], which in turn consolidates dysfunctional social structures that favour historically advantaged people, such as favouring men over women for STEM-related jobs [2]. Deepfake videos, and the AI systems that make them, are another manifestation of this consolidation of power, with the potential risks they pose, especially to women.

Deepfake applications use off-the-shelf AI algorithms to generate fake content. Algorithms such as generative adversarial networks (GANs), variational autoencoders (VAEs), and long short-term memory (LSTM) networks are used to train deepfake applications to swap the faces of the people in two different videos, or to copy the facial expressions of the person in one video onto the person in the other. The open-source deepfake applications FaceSwap and DeepFaceLab use VAE algorithms [5]. These deepfake applications, like other AI-based systems, don’t actually “learn” anything about the task they are supposed to do; rather, they learn spurious correlations between a group of variables present in the training datasets [6]. AI companies claim that their systems make the right decisions, but with no guarantee that they do so for the right reasons. Hence the term “black box” is used to describe AI-based systems.

Crowdflower

Guidelines to Figure-Eight (CrowdFlower) Platform

Published:

How does Figure-Eight work?

Figure-Eight is based on the idea of creating millions of simple online tasks to form a distributed computing machine. The business model of Figure-Eight revolves around a customer, a task, and a contributor. Figure-Eight provides tools for customers to upload their data and design their task, either by drag and drop or using the CrowdFlower Markup Language (CML). The task then becomes available for the contributors (the crowd) to perform. Figure-Eight also provides a set of tools to tune the desired quality, speed, and cost of the task. This section describes the mechanisms provided to achieve the highest possible quality in the task.
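For a flavour of what task design in CML looks like, here is an illustrative sketch of a single required multiple-choice question. The element names follow the platform’s public CML documentation, but treat the snippet as an assumption to be checked against the current docs; the `{{tweet_text}}` column name is hypothetical and would come from the customer’s uploaded data.

```
<!-- Illustrative CML sketch: one required multiple-choice question.
     Values from the uploaded data are referenced with {{column_name}}. -->
<p>Tweet: {{tweet_text}}</p>
<cml:radios label="Is this tweet about a violent incident?" validates="required">
  <cml:radio label="Yes" />
  <cml:radio label="No" />
  <cml:radio label="Not sure" />
</cml:radios>
```

Each row of the uploaded dataset instantiates one copy of this question, which is how a single task design fans out into thousands of small jobs for the crowd.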

Crowdsourcing

Guidelines to Figure-Eight (CrowdFlower) Platform

Published:

How does Figure-Eight work?

Figure-Eight is based on the idea of creating millions of simple online tasks to form a distributed computing machine. The business model of Figure-Eight revolves around a customer, a task, and a contributor. Figure-Eight provides tools for customers to upload their data and design their task, either by drag and drop or using the CrowdFlower Markup Language (CML). The task then becomes available for the contributors (the crowd) to perform. Figure-Eight also provides a set of tools to tune the desired quality, speed, and cost of the task. This section describes the mechanisms provided to achieve the highest possible quality in the task.

Data annotation

Crowdsourcing

Published:

What is crowdsourcing?

“Remember outsourcing? Sending jobs to India and China is so 2003. The new pool of cheap labor: everyday people using their spare cycles to create content, solve problems, even do corporate R & D.” By Jeff Howe

Deep Fakes

Discrimination in AI and DeepFakes

Published:

Introduction

Artificial intelligence (AI) algorithms are being used in our day-to-day tasks, from search engines and email filtering to content recommendation. Media coverage of AI tends to push us toward one of two scenarios: a positive scenario in which AI automates every task and solves every problem, and a doomsday scenario in which AI takes over humanity [1]. However, this coverage rarely engages in a constructive conversation about the realistic dangers that come with AI and how AI might affect us in the context of society, politics, economics, gender, race, sexual orientation, social class, and so on [1]. One aspect of AI’s impact on our societies is the consolidation of existing power dynamics. Research has shown that AI-based systems are prone to reproducing harmful social biases [34], which in turn consolidates dysfunctional social structures that favour historically advantaged people, such as favouring men over women for STEM-related jobs [2]. Deepfake videos, and the AI systems that make them, are another manifestation of this consolidation of power, with the potential risks they pose, especially to women.

Deepfake applications use off-the-shelf AI algorithms to generate fake content. Algorithms such as generative adversarial networks (GANs), variational autoencoders (VAEs), and long short-term memory (LSTM) networks are used to train deepfake applications to swap the faces of the people in two different videos, or to copy the facial expressions of the person in one video onto the person in the other. The open-source deepfake applications FaceSwap and DeepFaceLab use VAE algorithms [5]. These deepfake applications, like other AI-based systems, don’t actually “learn” anything about the task they are supposed to do; rather, they learn spurious correlations between a group of variables present in the training datasets [6]. AI companies claim that their systems make the right decisions, but with no guarantee that they do so for the right reasons. Hence the term “black box” is used to describe AI-based systems.

Discrimination

Discrimination in AI and DeepFakes

Published:

Introduction

Artificial intelligence (AI) algorithms are being used in our day-to-day tasks, from search engines and email filtering to content recommendation. Media coverage of AI tends to push us toward one of two scenarios: a positive scenario in which AI automates every task and solves every problem, and a doomsday scenario in which AI takes over humanity [1]. However, this coverage rarely engages in a constructive conversation about the realistic dangers that come with AI and how AI might affect us in the context of society, politics, economics, gender, race, sexual orientation, social class, and so on [1]. One aspect of AI’s impact on our societies is the consolidation of existing power dynamics. Research has shown that AI-based systems are prone to reproducing harmful social biases [34], which in turn consolidates dysfunctional social structures that favour historically advantaged people, such as favouring men over women for STEM-related jobs [2]. Deepfake videos, and the AI systems that make them, are another manifestation of this consolidation of power, with the potential risks they pose, especially to women.

Deepfake applications use off-the-shelf AI algorithms to generate fake content. Algorithms such as generative adversarial networks (GANs), variational autoencoders (VAEs), and long short-term memory (LSTM) networks are used to train deepfake applications to swap the faces of the people in two different videos, or to copy the facial expressions of the person in one video onto the person in the other. The open-source deepfake applications FaceSwap and DeepFaceLab use VAE algorithms [5]. These deepfake applications, like other AI-based systems, don’t actually “learn” anything about the task they are supposed to do; rather, they learn spurious correlations between a group of variables present in the training datasets [6]. AI companies claim that their systems make the right decisions, but with no guarantee that they do so for the right reasons. Hence the term “black box” is used to describe AI-based systems.

Elections

Spatial and Temporal analysis of Electoral violence of the Kenyan Elections 2008

Published:

In Kenya, after the 2007 presidential elections, the country was plunged into cycles of intense violence for two months, resulting in 1,200 people killed and 100,000 displaced (Brown & Sriram, 2012). Kenya is one of the few case studies in the literature where satellite images were used to detect the locations of violent incidents. This case study can help us use spatial analysis to understand the diffusion pattern of electoral violence in Kenya and to determine which of the aforementioned diffusion approaches (escalation, relocation, and violence legitimacy) applies.

Spatial and Temporal analysis of Electoral violence of the Nigerian Elections 2003

Published:

Nigeria held more than one election in the same month in 2003. It started with the House of Representatives and Senate elections, held on the same day, the 12th of April 2003. The presidential election followed on the 19th of April 2003, and on the 3rd of May 2003 the National Assembly election was conducted. According to the data collected by HRW, violence started earlier in Delta state, from January to March, as it is the most oil-producing state in Nigeria. The violent incidents included clashes between different ethnic groups as well as clashes between armed groups and security forces. The most violent incidents occurred on the 12th and 19th of April and the 3rd of May (on-election-day violence), and in most of them the ruling party was the main perpetrator. Some incidents started earlier, on the 10th and 11th of April, and some took place during the first three days of May. The violence began on the 10th of April in the Bassambiri area of Bayelsa state, in the form of clashes between thugs belonging to opposing parties. On the 11th, violent incidents took place in different areas (Amadi-amain in Rivers state and Warri in Delta state), in the form of the temporary displacement of people from their homes and attacks waged by opposing parties’ supporters.

Figure-Eight

Guidelines to Figure-Eight (CrowdFlower) Platform

Published:

How does Figure-Eight work?

Figure-Eight is based on the idea of creating millions of simple online tasks to form a distributed computing machine. The business model of Figure-Eight revolves around a customer, a task, and a contributor. Figure-Eight provides tools for customers to upload their data and design their task, either by drag and drop or using the CrowdFlower Markup Language (CML). The task then becomes available for the contributors (the crowd) to perform. Figure-Eight also provides a set of tools to tune the desired quality, speed, and cost of the task. This section describes the mechanisms provided to achieve the highest possible quality in the task.

Ground truth

Crowdsourcing

Published:

What is crowdsourcing?

“Remember outsourcing? Sending jobs to India and China is so 2003. The new pool of cheap labor: everyday people using their spare cycles to create content, solve problems, even do corporate R & D.” By Jeff Howe

Kenya

Spatial and Temporal analysis of Electoral violence of the Kenyan Elections 2008

Published:

In Kenya, after the 2007 presidential elections, the country was plunged into cycles of intense violence for two months, resulting in 1,200 people killed and 100,000 displaced (Brown & Sriram, 2012). Kenya is one of the few case studies in the literature where satellite images were used to detect the locations of violent incidents. This case study can help us use spatial analysis to understand the diffusion pattern of electoral violence in Kenya and to determine which of the aforementioned diffusion approaches (escalation, relocation, and violence legitimacy) applies.

Machine learning

Crowdsourcing

Published:

What is crowdsourcing?

“Remember outsourcing? Sending jobs to India and China is so 2003. The new pool of cheap labor: everyday people using their spare cycles to create content, solve problems, even do corporate R & D.” By Jeff Howe

Minorities

Whistling Vivaldi: How Stereotypes affect us and what we can do about them?

Published:

I received Claude Steele’s book “Whistling Vivaldi: How Stereotypes Affect Us and What We Can Do” as a Christmas gift, and it was the best gift I got this year. The book explains, in a scientific way backed by experiments and publications, how negative stereotypes and stigmatization attached to certain groups influence the intellectual and physical abilities of the people who belong to those groups. The book clarified a lot of what I went through as a female Egyptian Ph.D. student in Computer Science in the UK; I could relate to the feelings, the struggles, and the experiences of the students who participated in the experiments described in the book. I finished it in a week and decided to write this summary to spread the knowledge.

Negative Stereotypes

Whistling Vivaldi: How Stereotypes affect us and what we can do about them?

Published:

I received Claude Steele’s book “Whistling Vivaldi: How Stereotypes Affect Us and What We Can Do” as a Christmas gift, and it was the best gift I got this year. The book explains, in a scientific way backed by experiments and publications, how negative stereotypes and stigmatization attached to certain groups influence the intellectual and physical abilities of the people who belong to those groups. The book clarified a lot of what I went through as a female Egyptian Ph.D. student in Computer Science in the UK; I could relate to the feelings, the struggles, and the experiences of the students who participated in the experiments described in the book. I finished it in a week and decided to write this summary to spread the knowledge.

Nigeria

Spatial and Temporal analysis of Electoral violence of the Nigerian Elections 2003

Published:

Nigeria held more than one election in the same month in 2003. It started with the House of Representatives and Senate elections, held on the same day, the 12th of April 2003. The presidential election followed on the 19th of April 2003, and on the 3rd of May 2003 the National Assembly election was conducted. According to the data collected by HRW, violence started earlier in Delta state, from January to March, as it is the most oil-producing state in Nigeria. The violent incidents included clashes between different ethnic groups as well as clashes between armed groups and security forces. The most violent incidents occurred on the 12th and 19th of April and the 3rd of May (on-election-day violence), and in most of them the ruling party was the main perpetrator. Some incidents started earlier, on the 10th and 11th of April, and some took place during the first three days of May. The violence began on the 10th of April in the Bassambiri area of Bayelsa state, in the form of clashes between thugs belonging to opposing parties. On the 11th, violent incidents took place in different areas (Amadi-amain in Rivers state and Warri in Delta state), in the form of the temporary displacement of people from their homes and attacks waged by opposing parties’ supporters.

Stereotype Threat

Whistling Vivaldi: How Stereotypes affect us and what we can do about them?

Published:

I received Claude Steele’s book “Whistling Vivaldi: How Stereotypes Affect Us and What We Can Do” as a Christmas gift, and it was the best gift I got this year. The book explains, in a scientific way backed by experiments and publications, how negative stereotypes and stigmatization attached to certain groups influence the intellectual and physical abilities of the people who belong to those groups. The book clarified a lot of what I went through as a female Egyptian Ph.D. student in Computer Science in the UK; I could relate to the feelings, the struggles, and the experiences of the students who participated in the experiments described in the book. I finished it in a week and decided to write this summary to spread the knowledge.

Stigma

Whistling Vivaldi: How Stereotypes affect us and what we can do about them?

Published:

I received Claude Steele’s book “Whistling Vivaldi: How Stereotypes Affect Us and What We Can Do” as a Christmas gift, and it was the best gift I got this year. The book explains, in a scientific way backed by experiments and publications, how negative stereotypes and stigmatization attached to certain groups influence the intellectual and physical abilities of the people who belong to those groups. The book clarified a lot of what I went through as a female Egyptian Ph.D. student in Computer Science in the UK; I could relate to the feelings, the struggles, and the experiences of the students who participated in the experiments described in the book. I finished it in a week and decided to write this summary to spread the knowledge.

Violence

Spatial and Temporal analysis of Electoral violence of the Kenyan Elections 2008

Published:

In Kenya, after the 2007 presidential elections, the country was plunged into cycles of intense violence for two months, resulting in 1,200 people killed and 100,000 displaced (Brown & Sriram, 2012). Kenya is one of the few case studies in the literature where satellite images were used to detect the locations of violent incidents. This case study can help us use spatial analysis to understand the diffusion pattern of electoral violence in Kenya and to determine which of the aforementioned diffusion approaches (escalation, relocation, and violence legitimacy) applies.

Spatial and Temporal analysis of Electoral violence of the Nigerian Elections 2003

Published:

Nigeria held more than one election in the same month in 2003. It started with the House of Representatives and Senate elections, held on the same day, the 12th of April 2003. The presidential election followed on the 19th of April 2003, and on the 3rd of May 2003 the National Assembly election was conducted. According to the data collected by HRW, violence started earlier in Delta state, from January to March, as it is the most oil-producing state in Nigeria. The violent incidents included clashes between different ethnic groups as well as clashes between armed groups and security forces. The most violent incidents occurred on the 12th and 19th of April and the 3rd of May (on-election-day violence), and in most of them the ruling party was the main perpetrator. Some incidents started earlier, on the 10th and 11th of April, and some took place during the first three days of May. The violence began on the 10th of April in the Bassambiri area of Bayelsa state, in the form of clashes between thugs belonging to opposing parties. On the 11th, violent incidents took place in different areas (Amadi-amain in Rivers state and Warri in Delta state), in the form of the temporary displacement of people from their homes and attacks waged by opposing parties’ supporters.

spatial analysis

Spatial and Temporal analysis of Electoral violence of the Kenyan Elections 2008

Published:

In Kenya, after the 2007 presidential elections, the country was plunged into cycles of intense violence for two months, resulting in 1,200 people killed and 100,000 displaced (Brown & Sriram, 2012). Kenya is one of the few case studies in the literature where satellite images were used to detect the locations of violent incidents. This case study can help us use spatial analysis to understand the diffusion pattern of electoral violence in Kenya and to determine which of the aforementioned diffusion approaches (escalation, relocation, and violence legitimacy) applies.

Spatial and Temporal analysis of Electoral violence of the Nigerian Elections 2003

Published:

Nigeria held more than one election in the same month in 2003. It started with the House of Representatives and Senate elections, held on the same day, the 12th of April 2003. The presidential election followed on the 19th of April 2003, and on the 3rd of May 2003 the National Assembly election was conducted. According to the data collected by HRW, violence started earlier in Delta state, from January to March, as it is the most oil-producing state in Nigeria. The violent incidents included clashes between different ethnic groups as well as clashes between armed groups and security forces. The most violent incidents occurred on the 12th and 19th of April and the 3rd of May (on-election-day violence), and in most of them the ruling party was the main perpetrator. Some incidents started earlier, on the 10th and 11th of April, and some took place during the first three days of May. The violence began on the 10th of April in the Bassambiri area of Bayelsa state, in the form of clashes between thugs belonging to opposing parties. On the 11th, violent incidents took place in different areas (Amadi-amain in Rivers state and Warri in Delta state), in the form of the temporary displacement of people from their homes and attacks waged by opposing parties’ supporters.

temporal analysis

Spatial and Temporal analysis of Electoral violence of the Kenyan Elections 2008

Published:

In Kenya, after the 2007 presidential elections, the country was plunged into cycles of intense violence for two months, resulting in 1,200 people killed and 100,000 displaced (Brown & Sriram, 2012). Kenya is one of the few case studies in the literature where satellite images were used to detect the locations of violent incidents. This case study can help us use spatial analysis to understand the diffusion pattern of electoral violence in Kenya and to determine which of the aforementioned diffusion approaches (escalation, relocation, and violence legitimacy) applies.

Spatial and Temporal analysis of Electoral violence of the Nigerian Elections 2003

Published:

Nigeria held more than one election in the same month in 2003. It started with the House of Representatives and Senate elections, held on the same day, the 12th of April 2003. The presidential election followed on the 19th of April 2003, and on the 3rd of May 2003 the National Assembly election was conducted. According to the data collected by HRW, violence started earlier in Delta state, from January to March, as it is the most oil-producing state in Nigeria. The violent incidents included clashes between different ethnic groups as well as clashes between armed groups and security forces. The most violent incidents occurred on the 12th and 19th of April and the 3rd of May (on-election-day violence), and in most of them the ruling party was the main perpetrator. Some incidents started earlier, on the 10th and 11th of April, and some took place during the first three days of May. The violence began on the 10th of April in the Bassambiri area of Bayelsa state, in the form of clashes between thugs belonging to opposing parties. On the 11th, violent incidents took place in different areas (Amadi-amain in Rivers state and Warri in Delta state), in the form of the temporary displacement of people from their homes and attacks waged by opposing parties’ supporters.